[Binary archive content removed. The file is a POSIX tar archive containing:
  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz
The kubelet.log.gz member is gzip-compressed binary data and is not recoverable as text.]
3K>ONgCoKtzvà ^m͜nnXUPIE5-wޖW7*_fȆⴺ8fV!k&Qm\;_-R}Z!~0\mPfnuH<~8p4 aDƳ$, i&]<{ɴ9DqI{8\d8倈a2< C2r̵`pXxRAN栍!HؚQ44CZI[bku]u jUqm[}+Wogiʊ֧ĥ/$YY1'R ̊bB׫dnK[oqBmT4l^; }~3hpPc^^$itg$~ꂫ2JrbBn&T: c‡w00RMb5$m1 tzr]|)̤BP¸T\+x<|08%Zu9V_&39ÈgF/+vpvغtcݯ@,A'5v?rJ]՘N+ZtV*_5!χB*^e'^OBk*\pRZn9ۣ޼oy4Xh< qC2]^Ʊr)ye.ƇLxk'}+23/yKoȅ:xҩ+f)Po=Mm7WꄅijWaeSٝc^2*yf{.Cx㱽\1M2eL)e0 RiSXv'x4eg/^+Xݚ\Ѵť>[=0K eL4L43%-t]aQQ&ĴFwvTM'M].to}w!;mM7W7i[ZǑn C|2OnEH3ʬfhooY,q`؁tOa5,R6Z@;09 嚀V&,0NCcݴ'NP5YG!@'E_MM|.b!\@yS(oG '|IaLXCNgv8u̍c1|uyJ,N*QRꕦv}`v1D?'_\apfO,=<{@aB{ܳ>F&?ݎ] Bd<;N&aR|~{|]; Rj0* e>ʫ OZ v7:ORqij 7`|NLmrY}!g,\Ve׍43),0-'N)fDЌNfN9}˻1JL]C>,}6Nu)JPn¿>(@ZcR(u ^D6H'gY|wʼJbg[ҳ7pn}+%ã;?5ϐLwP1ӤӹaS5R UAg9,h[pi4S51톯&7SjŨ[>ݎR^zK7xmNoޮ[A5XmC!Ai&DuJIS>X5 $K E.Ji%c!9y1#9te[}ދY fZǍd*On\(5!&1&R u.:G0R̅L]Vy.ɵ7{*SgFTj5u&ADPi gt-{eqK7-7=lV9+6ju;(1%8RP/@!&Lo u+6dfUOCz7ي'r̊[* yruKD*bY'q3JY ńԄ@>zViǭQ+Lh z)uyǶҙ8;z ARwn)0H@x$d?iJ؉w6 Zk)sT'&i¼*eAn2B!.SU"\H:r({YdU9\JlbJ0j#5Vt\RmWb1XPYD'PKolF=uZ3r;*P  „]qB[#qp5ĺ3't:6,jݺպQ`)Nx(fr<9H+<ɣ)\Q!b|W jJ2Ex0))3$ Sc= ㈜O7~/9 '5c3Ud%Ldzim*慯_:dW?&K1dIOO?DdzԚ"/]I߶2'ܐ V =ceMCzRv%ZH% KΉe)9oE/<sN0"`DnS+knHcT݇"5kGسc{ZF$F @AՍhdDQ"P]U֗YRn&x92J#"( -J)&b;R"ӶeBŧdRʏ9be`UXFlɑDHćrCN|ȉrCN|ȉrCN|ȉrCN|8h$L$LL$ά$L$L$LGIęIęIęIęI-`KN2)y2xkKa)lPĂǶ!+xy0*+`]<,NNjO#248# !*bx"P 8)FʝXfNh{йgi3[`tᵬ{M(+W.P@b&"JcYţ3 _aiLk2 Q|[8u*Tx@$<:&,Y̰rL CV` j5t05Δ禞XQ"4A)4oK "E ]-D\T>P$~ j)7ba4UDJAൗ(/UXe3 w1NlHR7$p@8D@볪rϟ/\Д,={:B$yK10>W5yC MqRdq7) St],W|O<ģ-'{wBQ+XPqDp\ܢ+xSwa2׋rn" t16W  ݊N[lB<Y\€e^]&= Ied|=ϽU!|ݚʕLG Fo*_GbדAyi7~%Vw0W\Ɨ (U {+O4mݫ>ZU_zepA 1_8_]#KbdEh_&Qop>0!p';4݆4߆,o`> GLfpu6dTw%h]gtצgX 6kvBҍ9?Fc+{^ X7ּXݩ֩?/\o廷?.O?uo/u#0 .QEP_.@zy[nM֪ͧ*|7MoZ|f _0?:6 {צ=ŜY 3:W@l0~iTXJR)R_a0). 
)Gbn8ˋSGGfLsj?V܄8faBr`$<+%DzJiYa7 )ΛPlBarwHm +gXd^er e!H5U<=^ŀ%1P1]9OԮ!$0)xMFQ+``FBe"LLv3} ;(Iit\] lJa^iGjd^ wE%K1*3]Nl/~A]8ۥ/\Tv.M+I' t$q~X/ˤa -MRR}r$cnaç>װiafPc0艎 䁇o9}Oo4⋱5K>ay0I>/1",Dʬ2N*ЂFQ4`pHn#zmsʓAv4}c[h9Vq0wRSfqˏ!WịUh4BkM(@( F刊`) NP2mFg-j?F˜_묦z *decVcy:WhWp~5mlIcF1 )`)$$;Shy]e+'pyn}xb/z @yծ hXqr{#儨]ܽDl܎x+o'Ȱo#HK-a*խ?*Bq H1+A\x 2taE$X *ÿgߺXBz-vjLUpgz6]O|;W #]?FV}B.)An؟: ?s"ybpBi c-P`m :/2h<gu=\x``duޒ&,jr҅a*yY5f5u:/˩r[}ˣ8=i.xt@+;d DKٰ}sҒPN@1aR"i,=s2Z-Js6GM*޿]w9`3]l8QlcKi2(GkQڀJH13N9ťoW12#Vsf"|+f\l]{Z\l.n8峾XwW*ny Fr}@}ҙ OPM0+D{"#a:0- 9kk,,,r!c&S5;$ rڐ*"c}9AV9Ʋ=# #$$΀: dH[XA`Ir,aK,6l!>--[ZcZZ-F1/Ű/XMM:죫UE4[əmyғ?ݫ;Ih)C~~:&9- -AƒEiCxIwm8N;{w^u!Vʑg[3bmͮn϶\<>:Cj$;iWEuHNGqb0JsZ}4-N~pAΧ`N*g3ۆAS>k=p_z1+[\+v66fǁA>GYenݜ`"{&CDtF8SDc)Q"!`%K9 Q Ƒ6&M@ 9*pcSctk(ϘL Zj(Eާ7PO 3TlcΖ2af|:\[Yg mOlX(NlP{:15WNJw uД6lΨ@ lۆK6=mt 0e/wN^n-=8|-Ga~6/`D[{$طںe둟t[NMY)vFm{A?w)k580_κ8XwڅJKu tCtfsh7U\m( EyxRअ QL| [. psY縬ܔr3^6R%\ a|=?|:ݍ](8L0+&AlZcgmI ZrV8Af2"v_w ]lJa^iGjd^ wE%|| *k=?']Typv.MgnJ3ٌx`Rx*> s6r) ʂ)N **",atrnG@t3鰟hU:O( z̝vX-"ro0m:0#4'(xAc;jsk?|5A[RX=f\ٶkzxPߪrbqz{,n`u ePJ)PXIRԕ3IZ~gd%f.< ( rWI`!:㮒u]swWIJzJ+*$"qW B)뗤U쮞{z~$=:H`JeiaJ4 )> ?]F$쎻Jjw ]%)+)֝qWI\⮒JR* tW`Lb)6f9k՝*ՎM⫽?}~٫_)K BoG˙~0.B\`n ;A`}זNθ$>{7MaD6ų~9nӢC Ƽ;$]qW -9vH*Iyn;]] BrWI`wUفswWIJJzJ"(퐻Jkw箒TJR.řz9JaDEvDA`&pgUWꮸ+GWIJcW/]i ;䮒u]w&veUR쮞{zqZ?:H`)zg0qV'aaeulKdOu4߮#EnST4ȁy0 ewQnVlyspB7|N?;Ȕvc]٧Tٹn }ܸWn;nꝞ px>Bn u`ym~?1jrۇmN`xv$8?,X&NPQ""ehIFdVI$/Di (jyUǛ~@-}*/98F~6_0d[i0?$6?\eM9b ƍ8bBh @ز$M*WbkPj|Ds[]ftUS'^b{zytͽ(U* :k+-o Fieu)Guwcɰ"7ގHe)RYޙ6Z@.ؾ[5|nRS6Gm}z[+=~up}c!J %sMe!fi:?^NZQCw{npU ;mN8mQ0`<' ~wD:N'Ow/vzc.b٤"kغQkqFE ¦0I%0uJΘTTh+ӚKjՔ4}QnW,мYN,mI `Å}&G6z9 ais6)bavQ;aj4aNMÜ0yijI}_oÙ6M)rۚ> [E<6>3CJ]s8WR0=;czF BR{nnt=Z1mi@n:M#JA;>@ї_AH2Ma_6jB'BX ^:tϗv1Ums%J{+KJ;Zvْ>`usG)40| #Fx7a^2ַ4.:/7[?B:n!G0M!卒ryjSoBM=z3ަ0xB5KVG4v{bD"T0 B5{ ZNW1ҕ]撶eV#J"]l%[DWXHBm<]JIeGWHWVHaI Kў}8n{i|0(oooUJo(z#2``Ng=lχ2C~ʦkЕ걢qjZDWX.m+DD QG;:b["V•-thn:]!J::@`Mt+ZCW֒T T/Qu)qAՂ6oP=.4{߉\BӀVHtFv4}4-M^%fTjsvi'vt,t1UYb@񮔲5>F;DҺ 
"\ۚ9;@m:]!JHWFK%DWXSBS*=]!J%p"]Ymi]!} Q2WCWfCћ1Ԫ|T0`=b+߬J۰SLGW=ޓm]`NHk RBOt(sqGWBWhIY xW״ƻ䢣+,+ټ0G; ԖY+hezlvklMsKmWpn M#ZaNӈRu4}4 !ղ-'-i=Gc &ʇv[зf Zڋs5ZxwI+ƨ#e$ܖmjE4}zpMk"QҦG4.9FZŸh]!`ZCW״fY q~4qY|(J35`ya-q}t(/&:hCs6+g -A~3}bL \Ns:0B:>^0\cIQ(F{iIve֏#`BB$" |ARK']%/K/+y~?{Q/iq tA_x3A?.z$+沆{\gl0>l sk8~_kH0|Dܜ5ZU}y5;gUn4;gYalr;?͂Z`ﮏh_dM.&4!a&aVDS.Hx-L%Y\`&&*"&bd`޼.7P>GgΣà$Ε.t'96ɭ|Yt5}+ObyaUUT:wo` f~L=)8E{>;yNdzhuZUSW4ӷloLy\ٕ{+ /L/2x:y;m֣>X^CaRf?ό XRu|ea4BTrMY-LspD.K?g9=.qc\}' 9LFA;V.; n,{s8q"νp҃'8%z@8/FHȃS]d 4xs-7@n% i~Uƭ U:}g 3ϋ) bϳcE?odT2%ٗO2.:99^˗|# Qq\E<.J0H)6vf0V\TѤϽ[-?{>fuw~nM/N0D/u*&e޳.{4Q6|;C"^^âknRRQ5 jXu5ީk)aiQЃ؋8͢':W}VFuȪZڪOjJ[XV,A}~Hٹ W1Q/͗"VX,/W-/?o{Qt ͻ>|٫z?@pg?Q聩``UB0[Cu+p꿟QUPߢjn&U]zke+k>-Y3{;jO.ߜW]ׄc&D}t/̯wyVQif8܀yb!ņ1/Qop)?MNV(bapጾ2P0m"B 3N2&p|WdL.1{o$9젎؈$Dh*C9F4SE&*塀ĺ3nuN@*wn4oA6QnK7_?yo## wW*mq~V8}jWoo1EQն,Y~ݦGnSۗU_%_*[fs;Bj Z+OWW0_uB_(SJbסfj:fjx?_nx:}۰iwZ<*sCc/rܷ&԰( Wn5ayAlR-dw 3$PM Z[zgE׃$:d) ei{0E{erj gZ8Bm!Xal6_r~\ӤBJ`R$ŗġ;ey9鮪Uu=$okV~SlT;@~Gu _ قjvyi`V kC8reVHSh6ӽ?TY"DCc@qXg~yۛ䞃c^5|l&QSXb:$H6ӫs|.$QD5Ncp0u# ZMfpR9*[+qIQENm'׭B9V_"_Q$wAo!ZꉏK+Ť)oncǎ1Fo5>>?룉2]UU$Sk4LyqVK8_uTW_GjBHa-,(#8zP!GAĭA,!ߊX}H׎5(u>Ezf8*ɯ ;*4K(HsѬXAQT`IȍL>&nC,ٞE]IXRY 1lj,*4R%1aB{QV+2dhTj.2BL8MY\p^($E=YpOmh"iE)iŬg!tT*jJRG]="置(b(툟dڨ Th!'`Ȋr~b.pxGoR2K׳VNˆ;Ρ.)+|5mokc8|Z[ڔm} +q(|9.AZ8@j .gy)\E{lΠDE>NE]ߋs:y.E%W!WړK@[C"s&AS"'47HъPb;h>ǽ<= hỾ?_g2}pXJ56v [8VsuG@9|)#8 70 FU?pV|{VwCޞ+?/)}.=~o#Pߏ}?9'?ۏZzɝds< ĺLboM̃>jY gEYxj \@ \ȻOv{B`pf"X[?z_IM˫f5{KէV{_~ysC_0]޴bC'n؍t.ѿa?,5\jG "DG#q` r\KlFPdfKլMsifGyϱvO%C樌xK^%w`u P*̄9|n*3Ha#>6iPro$U 9( ~:`rwbu6ޤ A(czfl 8 rܜIē\x=qh`:UtOųgmўεmv6.1\^cC8S2+{r>Li$>p ύ~FA\H ?B儋s1JË#/{?&O >m16[GOw^;"76ުc<8S?k z?F͏N.T۬3mN=qAQkQԤ8Nx[r=<|\ؿ.ǠJ V[^Ў:t 1%7tu9\SJNca;`05AɏiZwA& YL´p9rP xq%ҿDZ DhAN/ͮ&{ 78'(&|joG)ӊYqWc`5yr5|:;eTSkP 5Kd(IqP:QǏ Ez9lE:!?3j63KBu0DŜdm MRH'zPLEH'>8`9+r c0e7SQk,tA sgLvTד\r43sb+8]†~Uh6lɔ=i6r ޭP0ߚ=zzjIVL DD-"} ۜG)k\p4%̩SjMlUVt MRH{ V3jX;QHQ\PP[1u>9&I'.)(@03,Y#$x1B=1؄a, tN)")68. 
]PM!>b+vd=}Qz:%bj`3442Έt\T>Z*@ۡ/1XԔj铌[)06*X'I ވ8p-) K¸"cG$cM>OM]#me/gP֭1*6q~]e(w '(,}JgϷg-y{d'\}e4݇`O>\~rwQ+aG\EI(e[z.zа!R_-3s̾rf׎W .Tp!?EP[.Ju{<k.֏Ϧ_IM˫ܻdFݽy}}W?ap}N`Pӻ6|z<Ժ{uu˧Y=5xnf9#r<9V;ٿBp#wAſYy݌Z8ZIpCKx 9F-SE#B iL)fn췍7BœMfVϔp"O[{ٶԽie>/O2Ut4U&d~eND| (ݵ~\J:wꊻ+SKű2NE%7Uםj{; nmkȸ+}tEG_=c-:3JvFgr j)Ǯ3sEQddFޢEѰ2,y~I^0(+9f_7]C& |F B#=JΉ92`kVw~{sx`\ r3j.69_> p-2?swzT|&J*oA/e c^&k*7ɼPU'gΡD#EဌyPn^KSA i8O .gy{½ fƲILˆ1u eMS~+tY6V9[e" O{l΀ I0x,9 XRK.xW,L ne*"܊%w:7IHU&Xv'5;y*Sḵ+R^ +ɔ4/=M @.{o?7[ ;yvS};X|6gWfB@W'%5Q,vhpN6bVM)y O0}a%&U9;Ka5\HJj" L+B8_؄2,]! ]TK8:DUۇj2IZqF(ELL HN2DOSPJRaSLo0pB1gR%KGe`ѝLLBVcG,JSN#QW\%2* .'L?՘uE]ejѧ>g*̬NG]E1u5+ URࢮ^";.=yDvR2wzu3{ٯ|? R[īqO?*&n~߃TSuvaָokHJ8$4!F򄔡q*' XiS<=OiS<=OiS󬫀l)")yJ{Ҟ)yJҞ)yJ{:)yJ{ҞZiSNPY %U0r3?ZL9r.O(H zKUiS<=OiQ<FiS<=Oisy`?XڲWLh_ rDł,I/ )b UzķK8iy>rX39xL, {}3.3' igi%62n5 o5EThR /iK6di2+DG.xav~YHk`c#p)wuzxG?;hkme/`;N^Nvl-eWY,-35llf:"9乒<(Vw?솫iXGS'W&V̛^\_d/~w{qY-2&R/kY0PMw!b[F9j7gYlOif0/x'*ee%M'([9A)pks E׊"ϱw+c"BCYa + h$Fjrnd{0c{Vǔf\ ~KnU-X3x&lP=ȒEc mFCɓ2બQNݵQjz%w2G(" ,+0]ËAz4wu8!CEܝ( 'c(/i؋@ Yj楧BJ,zt }3w۩#I -( 0r([i"b(& Oӌ&OG&9/Y]:!Qz4'wBGSns\0F{)B3,!{,8] C%Ӳ,.Z=ɷQo`5X8)6<B]LwKzB.ڀ)je8A t];˾LHߜmFéPȁMN.AIP{qC;/[ {C΀GFk"%BIU(_ 1^P F) 鈳ɝb c&l==ھ 8YWS}kN.D_WZm1 -eVUJXrV(_2ނdV[XNAjZ[ ֌d%,nr?3N|'V/ȇPPe9l3+Fڸ\aB{4'X'cJs \ !2qJH/sSA@bci

8-qZmr^sbZwu_y)6݆lampW_sF(me|LJ 3sAc6OCy^`HIrʹϕ6$W$tۿ~vn].gٔ괟V|&N?-ݲɟW0Eb~?zsTM\ߟ߽ll'f)xA`fq0dUpJ0d:8lƒj >( + Z?xP4M ٪QuE;mE.w fH('Z8{_}T~UdFMf:̀p:ՋK7[6Fc'ZշdKK;3୲#=Bb6>)ܔ넵>m;έ Yqϸ,88j<1Q灧C@$UE |PJ \rnqnSJivuE}߇U?Y0<_x+-h8Lj=j]Rji>^ڸ6PN0d<ʕ`4RbͼTd= Ca~=zt,{?Lv_{;gǤkPYiXn)s ʥ:w¸6N"E-9 窰EN~p,q QVzt;^'fezyB]Xv1QsnKռ&AIQt/ѐDkL>k UY&u[᳛ѡ$.h RGVaMBb&Kڂ93*\b⢝(E,ԑTiGZ*DR(݃=rh3J߸Vg\;Q|,z7rtӀm^sh]lWwz+5WZryZϳ7w^gn\0 7yzΛFPسN֖C޺3ZOן4]Mh  Y[]O"\_,fzwpL'\X)ȥ /B<uI4XY20'9cD++dBz+ION#ɁH./wY9ޞFh7t[2.YڇKx8>sY8%ޞ"rEN ~u 긺[(1(;oI3̜,K|f'fO 泷_F`ٳ n]rLd-hqxb=H|&.߉3WpfbĘZf޳NWGX))fC'bj MWj'қ&u+ڶ57Op$[Ҷmȸu]: 'I+F$=#[l/Psb\~ +:v V7/-k7Zoo:!aKbufpS 'Ap QQΠ<UdRjFWF6N5k*)86d}zFeOhEҍF_n?xO 2a!Z.%Zm^EӑkjsbS\r=*%0 2t)SR1V|AF%C*d| Lj.>lQ֤W3[awTc8.#!} 㪩uwP1D + f߾{Dzlt`{#t1 ﺘ$2g(=W0aǪ%h_U-.,YWQ\1)T{$`WqedU%I\=CqTZVw,q0^_P0vY4a#A2?9HysQ 95fr-Szzc? jjmBcmXJ!Cn8-xD5ĩrSc{srTIKG>hM~ 3TO`އŻarz'$ }%RD]W-^L!Q|NyX s/}e(zIJK4)%k6\X+wJT.(+=)x:M9b6ȹQjEw9lVJH";aJ1\Fd%[b_S>J hE&ۖmK {LbM ʋa6 сCb[Ɛ Dyrn`Yw6@k [ޫxӈ# GHx0 1V"P 8>$ҁ)wN`sB yNṊ-e1v2߄ze !f*t@*a8Òt0EΨ|^PSt;@yHh<:&,ḭN+2!@ A:wW}SHG$ #w4M=n"Eh0Rhr|䂈q,`w[Xh. *L'ˠV<:͡XMU0p$x 5 BVA*kGVp#͊ԭ"l.$ W% vDv|Vs>$5BH\%HbL~wC kIm:qCIƅ^$/̸wMKsrF@_uQ{,bXJrxO{PLݡÙ_' .@0! 
H q"VwW",fW0!=w\b'I lv~|aA\TYUqnd:K0/MWE/^\L/УI|6yl~s`j6Qt{)l!IcJUrbaDӧ៕ΛWي΀}-gKozf9GIV fS no-5CڛayQyj`XQ0iN>_;M vl6lkmKfS4!4 >\zVX$>Ű7,bKպ OA_oxuW70RVkԃۃ::p}]5 5MS6iZu9ޮB!{S}4Z)_.0?WónXqbB{5Laͯ@Aj*޸pWTr$r+wk% gg17c!ʸXv2aFT/MalїPnNq[`,YT`#)E<={Q!& Sj">d ;L-lH,m{GT]4t;Pmc@faBr= J%k"=4XYa,Ryb?-OhϪ?oZmq煟w B<=$;G-]fzuI: a2K"c,ݥ” 䁇|+8R*/Gָ|4r(:.9ՌfUa}gΗbrZ?3Gh SOq0h4BkM($QFG .*GT KIwжBV.fY6Өp8maٽg~_Tp p5H#{8ۨVZ}~,wP|*_Td$Q BMe:-Ħۢ%8\^+k)RD=\P50A*1jyΨjd*+W=2 oA1dZ]7#CK-f_ZPre4WJO&ix`0Qd eJ+gI1ɀcoj#,' 3jg>NLj9+j$>[X7%+ȱny.hKa t3/Jnt vl4ł>V4Mt K3X^u ~i% LBg+wf ʵYNF ]>*$0s]T7ҍ izЦ-K73&&W~2ןzN@!D!e!x0k5f,`ZF 96MVHKDg kgB4yw?yO6N IM% Q3' qfA%*0ńJB[6i"jimaIVA&G[Gv.rz7%T77Ry}JtI.+I(֢JH15N9ťoW12# Ɯ ߉Z-P֮ k֣4EѵTCoVYWc7 ݭx׀"AjijlɭA ͢|zQr>w*0g p"S8#Taj5]pywLQO*Fr {wVHKhϓ4-ՐKzM4JPI]:I%m%d9/i˵)I"igɢmlcr5u{0..(I[; Bۖ-{ ]{3ZkC[JZV[$Z箺|=5l6 )B[(bI?mNmNmn}§Xt֏x<^ v%C2@T08!Z:Σ͂6 s2eҜe&-<<]ּek\|.G r$@֓L"FL !TKIh$Ip]Pf|BKl*0 42}‰ݕYPpb =tXip}B(4Oۈ8Ǘm)3D'TDRWDK# , Xp?Ҽp+eT9MBp3((1=(5&* 5B@WʐoYE;ɳwt\Ij1]c??kBԉRU6G;ۋi\6sw_ʐj`8t&F*§\(lu]{u2SV&Xv{CP{;tu)JPtJFs䊜7ݞ8 ^@{Fpe+ wɼeg`uw5jx gTgY?5lt~:})}Ε+(Ⱦ:R600m a?>H)=>_ZqP?vhRSܸY/o_L.[g+: z.kzjd삡LAgCۃAHMKַ$.aTW6(*P ##G1~v:Уyx Y*#[>d]uϪ/L5$7,Q|1\gSzB;9,VUjUP&ۯnwB/O~|Sٿ~?}8̜1| ]QM˦Ms#i4ԓܮ!GiX6%Z*;^}ۏ?vGi 7=AW_n_VL4k٫Zz.c9D³?l1$]k92xɓ AJJ)+͓tcp F߬rm5 ?'@.N-?F WǞwJ/qH^Wr1d`UIʽ@nbscb$tRn֑\,$Xfum8W (IHyR dO*R1Z;t̒øꅑ#&T#P#__=R|k׭F%,23^\A$~ˏيzRE-cj;Vć9thVa\|5Ž&RB8 b*r\&pB2U\\)HooBBqc( Dh,@{Ђ୍c!T#q6#CEX{2k>D wm&_ Xzo-,K!/-Z \B.\OBwG~q k ܓy/`:B .lRJ-ABjP=M]tdq8܅_6(,hJ ćYR@ErRrEm퉜jؽtc[`:VR}!z|Q*J?'_"d,,3T@QBSZBt"Hu3zOSM^7 = LK|(CƉThTA2P}E)ޘwǗUse>9m&(NLMd-]yO^Fh\(d;Hy ?5\,~(#n!E)xP[L 9^=8l88J0 ж c5ðژ^(k Ԅv2 ) `\ᕐHМpMһ"s3y]u]ffv{Vÿ P?Sz.|?>\춷S@^J2*0F.{ǞVp 8W N-ugo ^0& / /dp)dlVqIeY_ZU<K(ޛ<љLFzT~MPw_O޻> L_n窼_jn| tsڢBu *XfbwOO:(p՜G]L8)3aǠSZեH2iI7Ve|؋@@l᥷Ebi:r_6H@n(5 1T($B8K{E锨G#,Jrk)r{_1zbW&-}g7<4}lk$ÂnC9~2E H $x$U  74*ؗ/R5Bd.,p.ЎY礲-jOlD4$°vi{n; 8 )SC/,scX˝ہ;@ȵSGav8J(!j %Չ" .yg+jj? 
N3M>7㯲)FT#hr]P^ Y)cEZ3.A(^kbBr\O||G?ޖqkUkӰ,(D*'6ԔY`4Օ,NN)9Q*S{ʘ8eL0Rd=II8 QQR ߈Z?uծ-k֣ll;]ݥfiHtt; OՃ_(\7P$q3$ DezlĄaU HЦ9lv}F.x#@D)ŀpt8$ \vϚXcHǬ(-t~&)ZGp0! eBsٵ jZiԴְ(nu]HpYNQ[l=xaGb+_ϭGQI4|39g"{ĩ@Z^2a!8*!9~jxo[.Ԉqp$d}B Xb$Q4qlJƛ|&>u˘qCjMb;x7Ԟ{$.r&`WRkQ{C9d/Vs٨S-p1 Ds&a>4v 3%C{*YrVƺYgklCIԎLu]˅6?oVW7wtԪYRO.sWn[ZjZJ&رx6|i82ŀDo !RXR!d–8nAr~ӭk^\['ҧ|6~@:3Z,ϼk6OPf4-.ttQ^_=gYGKXrM}^›+8٢慒!5ozwI3o#[9j. Of/<,3ќ>kx$׼>eHo°{5[^bi܋|QaVoZ9HٙC*]֛y[! [X 繺XwYf[k})kzY%n5sf*)fjh^>bJgZ' SHoz5tS6 ~s{wն9.wm K!XجI >ee;l~W=9xYR5|ʹzUuuW~N1ixb9!Uʘ^V[}WzN_31w{&6 2 M H-u=T5V+&h-&A}\2ؤN LmDtռ-TjDDG_E_O`]VI?(pAl qI1脹lLNth^Ў:i|YrB<\)j`07Vr{|2-\9L0ou T'<@rFϵ,wJÃ&HI֊З+'[B~o :;ĬW逰 9HysQ 9 Ww,g@nӄON.~2Mix#t/ trɽ]uLŪQN9S93-#TEOt^#?/wt)9a? T'( z̝vX-"ro0CJf/h&>̵0nPTkگ}J)!:^6caX܆aPVPIR%(%Q,9:oXV=!SI 7*|f!UL?Ϫ&^Ón>&#oR"F.7^F6b7*eD.)e+`Еv)hg R.+\v42 ' .(xwa{tn[Oxj1nÓDV+ٸ.գ򧟗!:߼)4hyUEMgnnҍWYq[RDq]uGyܘݍn)bf@/gycۮ<&In4"gCޅ;%7iA6ƒE ,ӽ;A6">@#9gr 8M(rneZGEdQ0A[X+7B"IF"n']vv|O޹.ܗh;s74С<Ç]0rA T\P@cwAT.tA J+u2 ȥJy*QH`NH]J}**Q鱫DzJh$tӖ܄vrG6"Y9/CëGyqTW+&nue2 M "56TsJՖc.5wﴜܣ~9wP0}U10 erArb#V*Wy/rNNWZ(ڜTIL0&n[8ARs"XNs"W>3P+ѻ؉J{ ؂0|Ѝ],3'r9:u=ʖRWIq/-&-=a7& pa)Ҝ޼47h y`mV@HqQ_[A(±4Xi.͐+"ڳuQECD4hG E2gPTE3XX Q16tIv=['W2Dzk`:w3Qn L6Ţf1 \bbt`hN7XjC0eЩ !#فtvbKx!*BR5.w{~,_y糡6KU@^ᛛ qF,`h^G"`"2-a$EV 1B Wg[Es+찴\s Q`Z^nn=~f5;(mܗ=1,w/K$^0O0<^BW# V/Y`{`8ytc sr-tk^Ў:i|9} r%EfK>2НS˷EO-ߥckJRa0` 5*Cz Z{1l__t |vSc BJBm(2%ʔg9ќ19pԮfzPry9}tP-:yxOGCJ9ōiU"bK(F[ :a5A I-smn>՚-qvj~u X}/p.aPVPKAD 0<)ͨoNrs R]yKn"'Jќdsk,Q?p2]n-d"n`cB)7j<[֑aYqGk,k%&R򉤏Y[dÅNc['x}ghp_ n =6СrCc:̉ܜ],},vB䌂+$T΂#"_AsߣAw)t]#9twD[Չ'|uw)M2Ԭ-j4xáڄ^,ImzpzʫV$۶.-I^2AFjV.VFRuFò`VBV>:|ϰଟ5/+r0߆߆CY>mKlxM|kYW|"uJz +AHkwൂln+Pջ~Uyf %7?qVPLkc!Q4+H8< cN٩MIG%YQ-JZвm~YY-QY ILO^g3V.~}}x(Ќ;_WxSedycUm%g翄ƍi ۼ-ZWV Z9kl+)ɞ0ԪU'Uf5X{7DȬQp*sM2Xn&L5/sbG쿁2s D=<OhWy|oѹ:IJcwpWY<ؒK?Mc}[08EеB(W"Y3vS5vlN$hO܃hYx5[@YMߵ,W /ҜQ,\yXTȘ^(c;qLF.)܎CQ{0rHRhㅑJmc}J!LE8IJwĺǣ끻[mN.l?z0E]3[1Q2p0R_#l y]XSm8vdRAX\q/1ך!w&"[#bJOc:t*+HXG"\JlB]OUc*5( K ȳe3r'v'.)`U$D5M?brEAk4&klBZAnƒQjpH :q%VTGQ21$x5Vt\T>XJ'wC[cQE3#7V`3Yd)80x9읏cޙI]H&} 
raJ kAhKx?:3ݶ&H7oGXTm.gR\ b ! (OG#G:ay5"q'5O"F"Cs<Vxα"&&,. 1##(# ,Kwq)0tc12<WyBYIr2|'mV cs%!2 hLiU~grP5HG$8j,ḬrL^\ Ay<q_+UWaX/[Xa"4A),4)G.H؂>t0=? _R=(eXMU0p$x5 BVA*kGfO]niա냬n˳\j)Z1$5B3{;UHs)]8Dѧ2,P%wRYާFI&c) 3Y:tiޔi$З佺GeQ P; *N]1:s3|rjyy[  ފN[lBeq.?{Ƒl _Qupl`e`O1E$%[VpġLʣ83TuS]]ed2J0/OŋYɥ\10|m vLPZM;6 aRT!7L~z4y:+nL>7ōW<8[34_^tʑz6s"q0cӡKPMM .!0T[ͼs`ŴjG:1^[+Ak:YWk]_Mk ҐT1e0t^1Kg,,V*@&]헷3׽8~۳ק?y~:?ޜz}v:?+Xuba6A܄;5@pGռj6U&{=*rMwG[.mMWB(ʭUu(K{B5sv83}+lPғ;ܤU,JR)r<a0.<=lK1(4V}6?!2K?+à z#(w0XΌ:o&uxŢeId”H )ESˀ`> )Gbn8kIc[G9/TCT g:8''!aBr΀= JKDzJi0D –T&Hhu &T]ou_Hc2`A[E~ƫwK($근Vpuz[wP+EBzaxa4E`RGۜ-rm|SMagL}FNᕂ4n_m6ƨŮM@}Gx~mj $%'4B<>3Q>EM6T$1F/%G5h#A͝FA#°lG1 c&PfgwPi:KS٬+q`0u{G_Ek8E&~sԼ||qDo/Do U"5̣1U)[BGm썜6J~/&D&TC2H[znE^G `;@y~k>jb`4UTsc%bCQ'@|rz&9rq>p ?قqq`vaciuabZ(*c3SZse Vr/Ʊ ?ix`0Q*g er+gI1i $_w٥'L.v/xĿdW3U* GfcCΡ|/;Iy#haJا %=B -/e\q H1 Q\4x  ҹVpb]T ]:zL- ZG阔i !cF@1gz4‚SѷhxW➀_u=R_`fR>I[5L=y "H ž?ozw'ohϟhy9OSNh31(0i6 "֥{&+3X. ]$90s(.C/ҍixЦʠہ%q'ګNk猣ta81z]`B<UX-zg՘IDk1rl5ͭ֎WItɇltEV㑐m2RE 6)GHO+a {H5XRD< ̂@9UJk؍lVzWlԗ$4  [Yh64č5gOQ$=qs2v6vݿY]+ UJz>Y ^oie Lɜ2}穯%LR}pOn`f+~&ZI7ƌήc.\z.۝0{V~=C]n%_ޭlσ[9Jf0.Z&Z2b ]hΔa߂+}dl=*P8… 0q` rJn e%CDUDD{l08gXuE("țVhMم EVa)ʺsqJyx[BL -"DcD}pN3QiZiiEBi; &RBFyA*He-!2A[EjV>wވpeyT N\8є g󥱅H\ڼH60 kUa`.~5cz(ʸde-V%S)]4Nti^nECD'3({LY+0T$'Wԝ(\h I|r lE-6.4!!JŴ'q!ϓ85y;0>d9K62T2%ٗէ8/^<_^- #imTX~cjڱat kE'ϠT!7L~z4y:+nL>7ōW<8[34_^tʑz6s"qcӡKPMM .!0T[ͼs`ŴjG:1^[+Ak:YWk]_MkPiH2Tfgrgui+ U  . ۙ^u@ӟ߼N^o_>;Dtz~:Kjvn S?OݣjTP_U5UlU zMv޻-+k@~]y?wG.MܞV:f-ٚ @W>W٠PwU1V*I]b Q ϶siKn0&wS vBdX1T``wnc, *0wHsc}o3 ~+oQ!& Sj">d ;L-nH%umPBR9Lx'8J^H'6X)) 'SJC%#lSpVPlBjG]kiL& #hKOr e}ĤŎ{Vԁ{tNwk=::<:W`{s;G/t#u%~IQ &uD9͙"<ݛK>c=>, [ǟ~[' [-vO CYU?zÃ0.$úض@).pi\xr2mu$;mǫJc^bNd9Н(LPKx-j05> RGÓB9!2`!@C\zY4S,m|jؽ4nnNZmuńz)v?yXHb(*  ^J"0zlNS)BBQ!ܨ;yd` 628L〡^GT KIwPiXkf ^Bh`6+V#ez^Op-վUCWG_n z8ɭ^ *W\j" =z -_<{y|Q%zI~\i4q)7c2On"K5tn.F乿.z‚{LCXQ +:䛐3ɑKcp-X0Df|!_c~6ahT V0b]l涏ϕw٥'L.v/x<3.cn8! a533Βt+|Eby3zſGWWC(o a 9T"ҽ S9,J6,=G <4 =x}֊:=r4E ^0zN^\J 0xmJ>b:set^~D󩒮X^j"SZHa=)pE, YȘK $a. 
&'R,+Zب3afl`XC,hӅX y7dav@S\L)&HEkm*1@86i纰GHƪ}C6f XҖ۬4QŐ|@RA&YZ:!.llöljiiiżkQ t>.jދ'{,5={k3 ,FwyYh"@muNKIk2#P*5yX37"ǡ&N$ q'Ĉ0ʯ"K}c>:Hd*e /51m~j7D7Wӗ'Zyx<8y!@s͑@*zk^&EO9ָ9f)^n{r_|mX4=S7rkw 7X&(}YL1UߟgM?ٍ*\ѡ"22.O7dU.2q~~IkaM/ڇͣtGfJ$ʧq[dKw,.&w((͟%O&::t9R`O?&ȵ&ؕ%uYH_oGsQoG7mXiƭLJ<.cMh&t $ h-ǐ־v(-+eCJrY2.s]6Û˲V$&`i&݇8J.ЯOqڶmydo>+=s|6nӲ[d'{z孧h0 M/<"@V ]? {Ը,^LF~5Y>U@Tl-2aV|=\nwlX(lC&´A]XvRK'-]+[Y*BvvMO=U/wJ^ne z8\ $5Nn˂=Lޯ\5]>_H[6w:S穉_3,4va#{~1扌'j2b.}3wv'NPia.BtN*z.zC.EmDQpI{o._=ڷj 'oh']㲅b!gVO!G F'F7syM3p=?܏Bj(܍nrrȃMI4 YlFh1a\M-f-IaRm9cH{Ic8 |4ĸYAo[3Q!-I8o|{!؊zlʹtz8E2<v+R<Z h oZJ+Yxd(;G,o/޾KhB5wL`"pr=Ԁ dYw7^LMr7iؠ0ry3<`N|pjP4UfgB="H^f40-ש:\i5x:;ep ,ԥuQ9m,S2!4w@ (xǰ3qvY3ۛU=u"~~|"U;&[GgS\[ ˩ifS4<)uQ*+4@F !zkXl|zɈ%#z{(<{KD.Du)uf*CV sk Yx_ydpdAdBS5nJsc6';)'x!^5o [=$5g|i '&-i>4)O|BIYO>sfE ̑]q틻*Z^HJluWo] Gz㮊]iN]UtݕDmAX鍻*wEJ.O])-+eA?SyEQ6TvbßoW;&L4碼`ron&C#ov ϵBa_?Q&%MquoP%I * K_uoM#dUqꋻ*:x")-+h6G]qEoUUkFtuhnBʢǟxP0+h! MP66߰ *{6X'xf "SZHa\(t8w 2+F9ƒ% X&23d%S,S :v> ܔ~H nqqbnƸ_\7ߝub_Զ!|wstS]x캚6|Y1DbP11B8eb7'dLQk-c4I97g具j  S3g&c~ѤI+ޭhx%44wyeNvrc1 'bpLZrל"S^Km2ȂW c&䠣=9I2^2ڂ0߲[V~oY-+ee9߲[V~ʽW-+e巬*e|Re$͙ۜ,SL=} Ņu(cyF]Betx6fbڢVbJ ZA+_%ĠVbJ ZA+1h ǏJ ZA+1h%ĠVbJ ZA;҂ä`e^Q5y^N^5>@(Ӧ(c{ʿRµ>k}Zl9dU>I抖ǛH܎/|\gk& f { "$ \|tIiH$چH:"|B6ipݜQeM~=6z0f7޴Iꕵry"Fm[ffuT i!e`a(]ߎgEa}~bgs)6nT#M盧`,n.|ǿkï>3n?ϟ~_p))O_/23'hVfC +w\t÷k)q<ͽ5Y⏳oĿFxW7ҍΊMDW@W9,ξF6.Y}uVF3UbA<6QPlQ(#-cٲYE[vS:3Fi:-08l^@c{ c爟S&)"Z-nwlW?s~B%zm.s3Rփ*0%&p"fP4 Pwh|& 8& LO8S9L#c D~kcF)ғ Bd>:ʼYjUFi|{BmFKZdZ(s, [QSE 2d#( 3_ٶ/~Yz뺯?9}-`{vp.>l; i)3'SLרS)e{y2Kޕ4~Q2Wۮ\m?jX<:<2;^Axs\@əHtY;"iJJ*Nx§#|Q^VCB9"A͊Cp!if6ILkdcIĿuR&&SJ0]U5ϋ\ZRpHeʢ{O.h@H^*&NHFe Uզi] x|?)}?:xff]CK]Hrf!!򺩴6K:Bs4FQJr2$.#9p<M@-pϖV.-Leos$O 1CʁڥEjMg;Yo[1vRoQ"S4j rEZ/&% g[ڰF]A%nwhQɖESAoBk2m[fzٳN~_^ڹ7ᴪZ{nJK-ܔЖo_Sß}y|)#X(ȓ(ʇ{~ 弡R8"EXן$'NGNAtv88"q4`vvZ R$ߛK-pq2[VI}^λٻov{oWp5T[aE>Ψ_Ѿ4C} N>"+bGr OTI 1 WHP'7N$Y)E4 l BX:K VDR:`Ix2&霙YD+25"-A) #;i;vhp?Sx4sXO۶v;$ ΀?8]AoFt"w@E$Ĺ M8) \a`-}\ !rmU4,UGMC__|7HB4OV[ի7 kpKI-2H)zɄyLx1 a7s;s]Ia9'Cy_ XqT.8xC̜uj/)Ϭv&kHej:'I 𒳗id !lX-؁$= NX 
N{1ԆզEŰQ+wXU7,7/ <<+;by^3;Լg-'h˫ݑ&M_ v9}"^fRESqߪz*sǵ1 THLyt)ڜ3)jcLI5nrwܵ#wgT6-̖k{(Lm7Ҁ~[7wgY wS=^\ZF!؞Z\BsHck+{wtⰬq[$lLDD2L1Ѡy6hkSNAYE'\ {x7c`pF"@-udmF\GLdɆpK#a2<֢b8㴎i'a{`y} 4NWSqۣWPwރ(l/z_ɀwluZdߌ$bd) /k#d "N8[ [.g TT1_yK}>:ZEuB#`t2|53H7wjwNE`Kvg<+)۸0E]@n\ykeMxn8HS%]H^2*3#1J2,%r`> N IpG {dA$sWfjs56Euo!r̽6Cr-⃻2Nuu=e+w[:rhY2݌ ǻ9 d4%5+haQ'sB3|O'*nbPq! \4+K9dt`8 6dDF*2_JhR<qwv[Ćl#˓OGb=|0zL !|߸FD*M2rKJd&U%o]qjҽ@.b  9$g՝uXWg7]yƩ×4.#7n[yC.pckdzsіNm6~oͦClIrn7mo|Z/tZnR%ifּ |Ğηt+Z֐/ΐq3B/.]ڰ˗~xǗl?+<}͏QS6+4N:doF>9mV!<'[aβo:9J[S*A1*w۵-ߍ􅵰vP}:E@t|Η 8s gv11dqh _2cuLq`xP u<񈖗/q픫tۤtֆ=}3W'Y鶭q6z>u1r+*{jM<\M^⎎Ǣy}cJ|]?=tF7ݔߙӆ{vڲA;= ݽ{~" Lw;κx2ԅJ uD tҡ碯wۥCQ(jYǼ't/_[9Vю:eC.e 9P.[:1\wF)|!aHD)-5 H2sr<: ޓ'RH(Iܩ\εE ڳi$'F!.sߑX"Y"QX"PM5NYrf5.yԎȲdAyRښPbt<< l)is >2_;3qlC4 #pY$3KѡN[$ZV p,1c'<6`gͥȿ;k8D J *FKmfR\ǐ 2E&uAP* 4Ct@07aK@h)YK}vz*[Usզ΅9<\aBƂ2^/Χ-oxLPm]d'JN|EoEI([v Mvy-O'<;v#x~|>J+248# 9VDz Pc:2"2Rwy>z[k>{/R(+W.Pb`DQ:ji08gXr"`T-Ψ;[slHwD@$,`D 3"˝V2Yq$/=VnQ8Pu>P[R=5nEh0RxiޖI9 >rAD8 ]-D\T>PI;Rh"Ї k/Q^ RY8J w1NЎ%6o!:Գ:mL6g)!0ڙ|.HkʅYgvB$ \K>0Ά_ӡQad% JP\t0E]Ny&wwU 2xْ9ErEcH4%9B>(Eb l8u^ߛTP& -,O!;V+3p lHϝiGa0N<۝lfI/&2%+w vtW=_ɋNN*Kb`Dhjz Ng`IF Ȋ !at6WJasvY5<6}tRy[x=_,*g+%z[ojg삣m~2.+0!԰+1z˦eH2 5.* } K;v1I3bf7^p7JUٴjӳ ,Z00pԓW̒>ٰJ^bդjݤpzvzWϟ~z۟gzϷgt:{7g޾؁qpd#jg~ Mv{K󶖆KSŶYZo^W!#7{|4Ҡd- ɷcBoWova,ń2*Kg~^z.C%3ܥ/Y aj@_xv-m m6.;q2c0Jo41i`p2 vwϚcПAF=iR;`-j>lë6(E )5`3SVmH3;2yIdZXry xJVx/$ '6XT`HO) 8K lYHVwAVPM}Fr뾎pcm.W&м]BQ/#Ajг:`2JbN~7IQ &uD9к͙"cj2Dٷok)~|nВ=oTZ%]h@%dQ,R;+Qo3S O[G'sC;Rk I -U5꡷Uۇ6y-?L,ܛ G`Od53?IT~;Y5I}crL쪼 OF'@û92.{g=_gp2jRp>_BRK.`:3髥^]7CL91 (LPKx֕$<TV(IP0(u ҩqA`$:Ȁ2fH0[.,)IJB0#B2H0z)#0+Cʘtˮ24 jRAB,qxu5]@}as]dvݚL՘я8Wnyu ɡa! 
s0OF d4 9j b /`_.ڹ7U3qMZ/u5.5s5οI{D=zgצke-e0yd>jbiJĨ sFV՜䙣c--,7X2{+J, g8)-_ȹe&v9FZ4<00Q*g er+gI1 j$'#a%*r\re&+NzVZZ$w 48ndoxO C2:5N9 ,Y?^,^{i_qqrnZyr7%uTw)QWC ǸqނcHGp0h2![nu '֥AuJ̋:pv9LLrt6Q($@=ah՞4#y4‚Sw'9p^afMY&~h`oq 6!z-$|R@u#7A~CEN~yb.wB󜁝lhcН%&o҉ziaZ}\@lTRpt#Z@YJ4TrI, Y۟<.C0!B*Caj3jX$"﵌nl5ͭv٫C}?YX匒'X VRH⑐mcƥ,lR[9[A(^ci&"ծן!I<E.(q, ̩NXHmDt&;JrՔMv=>ZA5fjG7e"zZDĦ] ,OY0S2WOsIBj+$GkQr |WN9ťoW12#XP1g&pmH";ڒfgO3jkspusWr{Ÿ%F&ex &_!"yYv) Ӂi]IX2;>IDa iU3PAbJA1 ('@ֆccJ`zHLdu)Рu0I̥6hQZGiҞ5(fEV8\,j>4[ܻ\չ a _(>ZJ7c!,s8UC@6ULkci X f]Zģ?zm_j=`fPDj> )DTZhǮ^˪vWl[܏}Z>S/n YgN!zWZե~R^F!/覣\ܧd2@ uz7CrYRMNjԻIl̬:lհA[m8 ꤑB:q/Ǿ2m:Žq]UiAX' N:wtӢ8x[rcayl11:0ADL),p !#t[9]0Exru([Í^]yA羯9X>O22%j _WҳU9)^9;%ȧ װ YkZ;}+oMicp9B{ d2 #w?{ijnP-%}VM~RXi.0}QӻٴAQX {]VYғ)Mإg '3f8@ mZ qxvfA/D. &cU?pGy$$\Kr1vw罄g/w^ALw^׋@kx4W*^1ݼ*ΙK/ZPjKr̙$ȡiA(Չ)eՉ&TD/ʣDDfQ*.pPd(sEP4B6D'kPX hUjhŠy̝vX-"7zf!% 3Bs4h mNs|J}=c>^6x3=ڭ GRk>p|sk $#Qw:Z*}p/Aѣ1btCd Rp6 eJm7Ir:1uVn q*֋e<@m֒-ӇOd,FKx3`s+{(5օ$K-K[4-wX^7z4]a@{hɲ|D9B+*Xh$7QޝiXpW5VПzigSD=9 w#{RQo>GJuFMʤ4룴YFMx?W'?xe}iԇ[nCݽ]Xk8~.EZY -o!nr"h*PKMFFa4I YLƪ}I)&jQBr)1k-b eq1rѪi-|I?li ˹vFKK<9)M;EjO-|Mj5crIJ[׌R8* LTGTI BDii{Bؔ L)J#I>bgž(f >=ruNj,CRB;ch9ilIiT*)5L:?d{Bp+&XZ\$g1fk+kὐ*D#vaOڈ "=xC]]\?,U6#mJf;Ud!jwLvk=I2JA JȬB2k|&Y kNa͸C@IP&v5>bg]`H֎\1eWLJ.J vfwBX@te EdҡI "m+Tho*>C"F C2(&& fCR$h~|ARѠP HfTW6~uQ $)bei R)~U*B"(Z%AIhP8PZl-jej $z W },|:omBi"3aO˾9FwuRYp|]1FL:;rB=!`} (9,dC>}*6>uok[2V=V3Qw4%M] + 2^-ZX8tLKw%ׁіFC˥{ wS yY`8/5XH脺 jIT 2D&0We2'E`:҅`a}BN\tH_R˨չ'Aqۀ8,TɂNU~<{UjΈJʡl+dəj&@Tt*W?Zw8o1T[4+XD(إt13=Ŭ:"@%\ Z˅&"zjI7Ӡk@)@` DY9K&V@^a:=$@o"}S<$}]`6 3k)Ȣo&CeC4bycUjQ`&uH4DES6`k.?h`e!5Hg& HZI(;@ڠR)Ԥފ{)@Elg JX1 [մײ&!Sj16M0ڃw:/O~}zto.Ouϵ]DFQ7)nnc@GBƔ("?1|v9,f6iY{ZQZthvf cӈ̯go9+͘X9,Dӳ.J\` AwP ,)aI吀MOHLYPqGo^bnlAW( e W@J*UJ\(\BϺ]FLSCTE# HMb$djbaisTu`-\GҩH*<QNڙZg$h;kV+U|?S5d$2r$jd-ڵNZ-AVA-_cf1hTBmE"Ufm` .RUrmt0q !+x4Sq:P( g՞sTj7eW=*W j5yzƳ $l:VN_بc{LFk8"•cAZ=z$6H.zo^mb}qxy2edzZqz[|bqq^p;z4;93 b2Bli ۯ_/2xp97WwӗuX5eͨAY(*TgȮaߕ6~7e0~k`~=tu_Ecnlg 
篧pl0hgqB>qHm|?":I].07'^|Þ.nUVHYdYVū~x(;poU}E.꿏L߬;_)_lCtJ6tk |&t)?_,^.> :[:S⫿N;ɿP7:E|t{ۑwSܭ/glVx=@o^tOrO}¬[o j~f\bIܷx_.srK95@xp]0 弟 wn0#p}W!qױ,npnZ"`)ܴ rӲGM%iioZjmI,U*&gdN (Cp콊](5v%҈oJiX]^-As5)h&.~m<mV:ڦ\Ϣmur/m3QǿRӧ`}s6w-s9dNO?ٻƍ%i}އE,K d = &ӤD5fV_U!0RC֔JnzN^o]Wpn꫃m:XiWH2;DW4i0˦^a:;"oG 'Wշ7fqlNU=jT4\q5^f ?XTX|%/2*}oKH56HBTc``σ8ifbZhb^ #J>n.`2x]Y?;*ᓌr %m> /~bN1+uXonC#3K*g<gߞENӷ%Ҩ޳|M޼\l~gϸmr;q [c='1L8Y&"qs;F0_Ö ƺUAT鿖ULjZ3H{uAƕ-iB~[x2~Ui!aiQ*c e2J!DaקE265mO ͛J><h=UjްWqT2Kmk {C "2r)VKy r5'ܨ˲|pflrw{8VR"W )#gIiBE'lTHR<* SUPQ f63'}ڗZ)>h9m^ŧ9ZQ^5J$7櫷_}[}P}?;^f^(:,IĊ_a!ƴSHT Ɯg·!eY pY|7a 7ƏQ֔o} ۆAQC(ӑ JO7Ca$ӓ}~vn}7;2<׷ * ^S\ +t[ iZ?E7qҞv hiϛ"~ 㮠Uh_6I(_#=Zu*GbuH(Y9 e_oF3Eyb.sB󌁞lhcxYG^[Ć%X2䡖V.L4i:Wh^q30yM]N_?oΰf Džu`xj0)\AR$Qnmn<λɾY!krD`|Uj;(>:qHHqQx&n @{HukDYVE0#wU3{N#ci' s3{?!JTn2t2l%,Z߾f}5;伸k.g{zJ$_v}g5OsraRSg'|%a{(Gy]-k94c2f|X2?6]uYěF生eE#&oTlTU_>ze;W-kUKyqPGp6c1@HL̬$C~ZU>gKfm:޸.fr ~ҩ-G Em 4בaYATJS"D4>\ۄv(ES>x0Oe.җ0LK3$z&c>TrMpJa Y"1Aqvn5ᅨؑ-#~:tG{|tW: XɫIr&5cqI(I#&,vD.3EO 1@^C/^%Kd8XD}RJ  6 KZrYP a,YduP*N=c|):ݻqEI=]/N]wΡ%販 ?E70 b~hAbuV=,8nChhwf!̠{}Uϫ;=M@Ws3U܆[mgޫ[#ZuG/|_EUSﺿ=pzՙSO\,!#EVZWBsj2RM U'"1T+8 WFhrՖgf>L_lc O?<`FǾlDe?"FhՇ6uՁn eedyh5W8Eϑ//(fM|EѼRAslIs-x]C6Zb5yDTzlYÙ=cy>Ҭ{m33V'faWy05tӫc'#k<n?ȎN.d\.AKKr[&^液x^SGXUW{|Yλu:L0 *?3GR\jXV.h-\AޡrK282/-]vJkuDG_GQ(*oǼN:~i;Fb[LL-0ShKm(|`KBđQdg$8K"ha8hƵE6]hV׽'|'1܏a.]{KOg.d &cet;I̱6$ ϴ,sJÃ&l/(iJZ c_כ\tĬW逰 9HysQ 9)IޙlNi2Er)<:vZR*7jONc7#1#B mtd.ҵr) eQ1ICӂ8BQq?.< ?\Dʣc"A(8`(29 (F!b5@#$Ԫ!cue12E^썶ƽuH ÌМ[:NB._RMPY> .OmgŠB&)a:d=T9[7m<+3eHcq"RQLd V،2S`|0kk߳f XGi(bFBk,C {6&zm:JXaLjKqСaKIeFj*e;Q!(8vtl aag Uнv=PrUl &vY_#YzM5AX\q/1ך!w&e bI+=Q) ap) {=evQc(e(A@ڬS#:Ά2ckrI">8"ATT.&O=xgQZ;PD*,Dؗ&|Be!&KTdThsL>5*GQ2aI&& j{Fy0_/cןFc)˵.!-{4E nVk,Z#s6-F\JƠZWoӌKv4w54{}v` 3*cRb2叫o zg(^W.fr l'}6aĂ&>l.#cmBm\۪S#wU2dNê^P֛.Cˤ7Ԑ 7kG0Ѵsяc8^F@yxeZ" Ne ehIym%u7Zw3h$q|,2)d8Մ5L<0-R2RDJN/)Pʍ"VFudXDcZ)"VŸt h:8 mH8dOߑq+RS~埽 B灬WGcHp]ڑw/HF Rd6DXPL bAZƐ {&^y޽Ҧ-u^z|/gđBxD#$<@7ELId#J27" Hss;%.qpٛ0a[6}PVR\L1LQEB-gXr".˛Ui+#ށGǤ"%Wai%X#pdtvKGY4d|hچ98`ivK "E 8]-D\T>P,y~9TE)!A^ 
T Taq w1MОeCvvn։;lϒq0K\є x`#_[$AKd9B d2RY㾤Fq?&^]O{jxG8</e05(V Voߪ YtwJWQ})*@7M#x7TJ?x#_]yoG*> 썍x8yxBbB2I)z^␴8HkX;i5u:gP&-jͳӇ_7o/'Ճypb2g4{i>pQ,ճs ~.`0xj_=e0y'> L4pп7z<wٛ[㍣~zM6U`)tq:k59?#YrXg8KRTTM^v3׻r淟ޟ>{u}ۛߟc~8;%ho`&)Nd dz`h0^=4UlU6| 60]d 禚` J|䫺wvaY:̲, ht'0m`ZzU]$r:+w A4uoicpi3]٢Ջ ZE0!LO^6bLru3Ey2T>c:. YIH_ouR(7P'v #ʶBEJ_Ҹ *u8]zM nH h&YbNd9Н(LPKx$k*]|Y{ CNI K'$A,1C`-a V}]$Ex!Uxd!T7HԔTZ b FрG!eLDe_Wk҇;WCDkǧWM5.C:VBzrQ+7?Aڭ ص$Y(*  ^J"0zlNS6`B-טG `(3 3hR*+w/ t6sJh8Ħ黣.w(5vEg5~ҙ~&O߯ G>mP]*/PN?_ɋwzg$&'nKV[i·$7Ifqr9O P"D %Yp`xf 0n`~M>- c< ?:zbW߾:uLklhc'ċJ+Ɲ#:bЁHV ;e =QF@;mTQUeƗv`A?_qDN[m]rz՛tZ@ 9GF]WKȽ{N ce\k]pRZ ;j/ s cK< j.Z`$5a!qZ8`{[oA#0 b VF D%kjPo-TJN°,`o4x5p$N)&,QM=hd:$Z=/7piBT9ALiFzϰ5bqW*i1 ).T IlDL0Ph pSTQ*摔)-*lS^KA3 Xg$G*@pI.,ad!IU2[p뤔y$䩪`Ȇ C_I:M r#v 3O"af"O CfL_#ːiPp5! LO!V趼Q˙K/Z5<Q3IC9ӂ8BQuf?xV>r@Y1 b(*bqP03("X{S0Q I1:T])ʉX ̃ŪV`Fla`o0mj֦Źk)J][RsaB3x/z|}>wmDϋfv3W b:8NRܛmtݻ$R<F9x2Ü9Q [cI+Lwrw<ۿGlN%J'M;v>eB{8&Vi]26- "]$3,$\ }&Y7ס 0V bڸokw7ItHPs5@G;rm nt.Vkt#J k&2McIL*,aviO&RR-, TTxr8(b"(UL'3z _O d('ٷz;YbҕۤJKWA0Wiܭ1{%n%^/} |4MMg WEﲻA*=φnٝI ,Ic7xDrSr* eՏ *:ZZ"Ai!ߊ9ȳ08*ADI\w@U#i~,|Ɓ`s(8W6flݳu)90Fs̶kKUv>[d :|H\}Lr&R^E;_-ڈ ys?>)f<1qjJ!.=C\z`q॔{ clX-γAXTl.y AHRVa S띱VcHm佖'b5ͭq[ΖIU{5X(.XrQ>xuqV"qV~LY-h"m=SEɷV ]Zx뮇WOص-m=ɵS^ʌ`w3?i]L޲fg'Ρu:/[s0z'hgf!̡e^Qݮ;=orl\6W]i1[64?myʟnBr{/z]rSҬ_7κB|C/Ϻ ay⒳ [wʱt)Iuy$] m:Ci2jHiv|i\??|HjVfwUnl0M.|6fG͠A[]"qzZ yn5`W ؚz|gNaxRlz,7R^5Cd9YM77XH>]X{))Jnpb΂wDðq I ceߜ6;bWs[Rw{+X Zl6(9,jpӰ ٞxRǗk8sgv11eqh_>2cuLL1s u;Fb[LL-0ShKm(<0/mZ82L>{<<\k>Lvi{j!=m,\B-mhz'x򪨩P -æUd)]$HxI)ŭGil!z2/hKym%wH4H8v>RJ` 0⵴ dDK ))" %]ro(y:%H|ABu*K8O5i,?vygGW#@E|ҁg^uxgꐱ̏vpӽ 5 sD{9Y ) )oǘ^(c;0qLF.)ǭ9a䐤2 #52 CP '^q[ Zf8ש\~5Hǁu@n. 
ͤ`PWޮ!֔H]קmxEpGf1WK̵fE;RA<0Ĉ Q0iH0jvek$]K׬S#Z֦e]rH"8V@4ٶI~b);iLȄ" 4Va!¦js6a9 y-Bj]TXőp.uXE0=K"2 3^%#(CM71Qc5@5{UAv:mtH<4*+Nq,U@p ✍\ێR0cMF\J^5Π[[M[سӕO珶_]рmP9  $hn߫wT޾J^Ճv-ˆ,Y<[ #ĦjmYˇ>B??.1!KtR)-j !wå+9\,~(#n!E)xP[L +-I%hHg 1)ɡj{ $F"<PpҖB$B)(py!W:(LaÆa9Av2oK9% 將NwEOI qB$DE)a$N.a:A' AQ݅6:.ri"gM%fX^7e  "\$lBJA:$WP6yDHٍAIfW10il< '+5:ˬZ(@P}#HGpL s3ϭۈ8A8ZJ%($JHpvz44B@׉P/<:T9MBp3((F8JqC h㽐$ĕ22;EjVк1\?`RnҚ,*,~g5-JۢG zi\^k"g0)s|mIU.Ry燲:/]O*̸ONu?ƊN*8[poR8@J͟dǗ ǃSx }<8-fE .cU^l"V Usr52Ni#L28gsǡ W cQUd%_U00mg_1}RZ3¹޸RX~g\k}Mu۫«El1D~,7]X(BAt5_}iI- uͰEQyja0lr0s1AzMdS\*#[ur]Vb8LCrQȟJE^~Ai17nYfG?".QۏoO{woOooN)e_}-:AL 䈭$Iu',ہ;M4-jۛ5͍ئid ]Ch]^]f+@ R ?]}v_uVYm܍'t3g"*e+A,$BEo\'*q*͵S.beB zs֭ cH16be_axLÔ JTp18sF_n@ v6ƾJ3`@Hb VFb M#,He:΢ "4P]b4&L*@M,xxL 48N=>yH+$vjeʫR zRZGI#-Kꅑ#&T#PV!xTy Ԇ*}=>.[QGNS-qǡ83V-Svl腛څa?I)Z(6uݗYUZSS%MWţ/ ftk_mPXbPG߂6*b"AdTSz]ns3mKԖ}-ׂ/U=Y&ڥw'ZYiʯv澞f4{C ~0 \(ƠQQ};?gƍ^ =\ G;;?Qd>n.t?n _jgnT&ep,H@sVMF _"go7S/T^^@g7 ^>26xj&l{7#l'1Is}ze5Ow~_f5[\Vh0m 6PῪՑ|ro29/ zx5Өt~=~gD+Pt/ce͟q>~"sPee)+ $ c9lٺh`mx]Q P~0]WyKݢ[%;@ON hI.~~[4j8/!$JP7)Q^Zj *K1N\:d7aHbL#N&&N&cEINCTTqG.j26F7z)b BѰ--ܫ-S6SkS7_4j~Ɠb7aLI lxN#QWcU)@"A&0٭ŖV|N>Q.x#@8ny.Q"0gc̈1Hp"5kB,_v:gcPDZkP*W`B@G ͝"0D:fCӴN:MۧZC@h*Eo'v}l[׭>"*rͬ4U@ENg^4l@kTKy^ZF5,GEwp =j~_я[.PH%h }舏N%F>h#퐮U}Xamq[wڂxO=ͥTSI\@mLzkjL3+s=^PQ%[g&~%{] Ss(P>bsmji6Y@UHf]F:ܮ'%o)FIhDnqpȒ C-vlyJP&3 *\Fl[tC7әJޙ/LK-2WH0c5*+Z3^\e*MDs4ђ\!gr۳f5JXħ3W_҄S}ǻCτcz~/~> I>u8.]Q2J(jDdU"G|g%Iӫ@8Ւx5vw':@h t01U.!z6t$`@RL"F|(ʤU;1RR-g)+FoR_L)sJE Xai"$I[,% ހPp,9 K7^ }P-l0/zܳWxr_GB#B>|&rkX F4y>FCO?d+̸=5̟֧k<D|쬩6QGTy0dWCruSx wp5fE [)XO] b:m`3P 7>tç=OyAn3bu>pbu]բdJ?[NדJTaa47bMY|ВSn)Xg!f ?v~]?;0`?xfV͠+~ waEM_ש|d[Sb-B "`@]kHQǐtYت#b繋 -jRvN0{``OScIx8 $be$pPX?rgR:΢ "4P]b4&ܙ*oUr/P711\T:)Z#s\լ,$W_e>)'9W1*-@IEj5Fk'hԩ|t\#j o\H=b!{Ew6{P&>I;(pPWԙ[x#`64VJYr8nxdQq_=ZV"{%MBa+ᬪʥX]BL!noB8iW_ .ъ %qUa۽8CFgWf]mzضHDHKsLZ".&gJ&Kə=vBLTJLsB$/K[\j: *kϮbF:oTLT &*A$V;~M ds4[e BA=a=ZJ7Sv]۟E{]nivF( `㺡 ?mgFf"$^kdbo^Pcp[*1Z(byw;+~[WdEP<.8|4oKRY"Qqf=j!`D$F]H "! 
@zj#p_9x)D[USLl ǟ%?E;# Dr&&U"g7ͨ6Q#wU7ʗl/)+>9(-cR(- <0Tڐws __䤾" ]Ξ)ԃ|k7<}~)yE Q9 9hd$(":rjP+aO̗-kƖoIQ&4Q竧CMpPHp ` ū'«[kǠ;d^AJt:gbu=G|gwyme$/@ZA dF!L:j]T2t$`b1b(Ӗs))\H>_IPDR #zj ¦ļi OR#g7>[Ba@qJ['^qg'[!qo/> ߒ?Q͇` }N}pe:\5RpejTZp!mk)$z[ ~28AdZY{L  , H)x#D5GOgje cD9Oh x $]Ta=uK礫sЭ { ޻r>Dn%m=9L]5tYou_!/6[uմq9.l4~jb [C3v>g=/ܾrO{Ϟr {:c#_Zuܱ5ӟ7]Up[y6i[7lN=I0`Ә-e\תdV#zDZSd4J*J*VUVHSh ԿOM ]|sH-VȿЫz,NlGl(}F;md%zs|,WG+wx1}]5Amh{٦|n;,^qn=+e0K囖oΒoV9x 7͖= ]C:Zbl͚ pi[r%37pf˘-c14CČ2+3ţa'{y`j^%|"YGji 'i[Be5z>|Rܚ tŬ8^X4j,Y{\+d=,ںplؑg:Qˌ6G&i6jXb\tmst"s{ͻx) LD tҢClgh6%_mP1o!Iw] 0"RO :)(炧訶XiW&xNL~ !6})7.VYtꬃ4d4 ?Q=\%,]w% -xp? PIBQMF^*^} Z 0'20kq%wUzvxt/A?`*[,ً Lbz1TXbPGß6*b"AdTZVTP \o8h-tߋ@ x3v4?B~:տEW?}7T6U*\gSy9}1`r>U,~ -6iϖi}RR Tk.ju4[K6 qAH96?sη>qn(=<Ҍ{OΚaA=fIT[Ih Y!ߴq.Ehۤi@C\Ң@NP:'.*- KC$Ec\t, 2ZؾW=J:9p놇!IUn[Ȕt4K/~E6%G~z6WeJ#ZAB}pcD0DYJD0#3AX(+DaRk(qs.A:#ADQW%8E}RWps,?S|'` @-m zoR+M|i\6!y25Im|&@$x0At)Q@E|UojitN<37NIB+i2E~D 7J>I[;#A -;}JhQ mecPk<9P<+/z՗ѷ|c $)8P_L%s*E X[ꩊC=8rܽ|]L.}qt7~ś95x}g:B>/"ɬIA<}TK)R!)b~zFW9xLŤ)'~81f.[7#AͻBALc,L.UPzCٚ ]fiU vV85fp Q38jG5fp ]^38jGUjGN/`Q386I#W H-вY12<˞1 q ]-qIvSv𠟂TB)$0FDF@aV^ddDD Җv Nwk@MS~"GPj@M 5&P:71R}M]^ 5q$a#&GaT*76CnZkC M7kfϨt*:ތxj6}}]vWw yIk602`%4}ɸb3w-<V"Ȭ.ZꝵsMsY>.ffwvoؿG$Kn2zdZD 椕N`H&2\6u![=ӥ'Ж '}RQg3I$EG?;JqL7sl8l,c`AÎR͞B7>ޖm6BAW KQe1Vx$KJFy)&2hRŨ]ݥw-Fo' A5Zn]II,#(] x*GJ7JbΥMG:6=;ڠ qY( #+ _^/̤ϋNŕIS;L ݞdk,IoG()&xHqtN\#?y\MdϏpfd+GaȜTxt z9tE ҏ'wvxzR2}3]#0|oZ_;]֔WV絶K0 y<LSn~̑$J%gJrލ'ݛ^܌gevM{׫s?̃le^G i}˥0f$GuHvUQyKjp8jm`YŲBO= 𨋇\7ghX:) А2"hG_Nz^#N nYهڇ>TL^_8?=|W}ɿ盓o{{…=y'o|KΟhK{SϞ/ϚϘ0jho1mO=_ǝ[׌{W}[Bۏ˝U )xa[2zt~QZ?z$V/a[hڤ糂I/?oHc5!u~eo1_rXӎyc jF98+N.N`]ucgZX̩Ĭ*iv4~rhtWu:)aA[.ςL Ng @a^,HPrTRm>7O7^[ 9}P# xENIؓNyn1^ dYrHVyˬCN&8xݷ>1ҙLp'E ?]A)Hs#>(˨tK71`Mk7Wu)鐖GfDW2h0.Ih;8rSJ?L\J,|!y^Zydu!gЀbp.ifLb:9! 
[Binary content removed: gzip-compressed archive member `var/home/core/zuul-output/logs/kubelet.log.gz` (a kubelet log from a Zuul CI job). The compressed bytes are not renderable as text and contain no recoverable content.]
N},xS⌡xCRK'֊VS YMПG/&kC3N-wQ-9T:7dv8i.vV;-VBco<hmIUMZ㰯@C4HXoZ+*ijBtZ+hZ;5j{%sGE)J.{RT^+Ż+nz'/p4.#g˸r2AC# ib!R:Jt\SA5V9pʸ+  y–'h&tLR %Rॐ 9P/!qH6uhRYBiFh5l21dUA^4H`8/Jj FeiMT HĘ2G%p)$Q*KFe4gG5vw0pMWD) ~-yRF|VcT:HvD- %fs8=yQ 7EqQDޒ7*lwəx_k`S# Q ̙evZcu݇bV3ͣxKQdhYyOY{bp=T>kFE(؞:_[:WLF4_6/:U:"< /䐹\Z5V<'hYgܳhb`rp!ka  Տy/62r{k/ u}Fn(FrpTIJ TI|1J% Oi U{X#Q);!Y@P,-2?d}bk0pu)1܇iy>]̮g>K.۔~WimڸnyU^#%1Monֿ6{3;kYVpyاV fH+n`$Qj)Z}b OʫdwyܵZ& xudxq ~OmW3<:.bݿ.h]kեwH,E1~խI xh-/9C4FiM2OcM(ia3 C0[,XHQEOnCV=+HbR4qdGފ2^8\m(ttɸ.f=.tio Nq_ BH(畆bmku]oɓ1< {sx_L.J9T3sLZvb0lAX lP]L/cF}. |avkj> Z_,!I-jRh# hq0]OϘq"LQf`(YE4$YDWO _˞9^HЪɵ qڱ{@ɬfLOY43Nj+kJ M͠ m=b ϺyLaD~Cd{Rt?G|hJެvwz j-+̐cUCOȋ~h$ UM WU1Lf ѳVutII*yXlنq(}>H𧸀Lrj{Z5ٓ~ܗ߁ZHg"qDsI{j/@jeBʤ/!\L<0w)UB)\c@{ufP/'a8KUF[I@Uߎ Oz;lxo'pC;~sKm^{O!xwWGm`kGwqH_K1Y| ȇV%p>+<$gd 0lkdX,ѓ-)#*[ez@z]W VU 2Y\ol;Ww Rz`LXjbuÐWCй7}[8^h>a# WCO^\€op 9ۿѻEX̞T.v-y۵\ЊJPx8eP%:! qx rz Ewg"0jϸ0V#C%ot 1@[v.iTO2 z9E*LÝIJo7(Kq up8%,x'Q-놙$-a5͒456H9Ȅ"3cxNb&X '8xҁ*4GcH XIR ڡ W()DrJ%^hCIX%bwR2୒y#8B{_M' bJ#J#J75ŕл -bWT3Q ,J>PܩuB:).̗fs6~vBQێYVa'2jije;Bw-tǂ2[`ɠ"#(qFS✂!ֆWdkCÂ`vl7 کt0#Nx>N_m6Rլ4T PN<{KF] &E'2ڹ2,ФP!6I9Z_2!: 索߻OZ]ߜ{眗$%"*l '2Ν4HCP,iPT.`V*ifiukYm8GAAʨ$!X$J̌yj̴nRzkejΚ]o]?=7F4Y4peWTDNvh~މTJ-*z(P(ap%|B~> TQ>rw)FnJ\deq,KK f|!r&dF:;2;٦A[x^Cf[C8ycMmuz7lg/ZϯD{|v Σ=d,'$bE<јr*Md]p?LYơ\:$y3#džoacԁ =[U7sajz+Qm,R^a$5j+i {M(^K7ͥ T*x唠K5qwYL(fJpgy z7; aIvFM}scqo\I.4?6' +;D[Rnݡ%BgGaV,; O۞jR_!iab4RrY]dħ$T&.# Q)FD@ EڥGrg 9xrvfݷJ"zhJr _f)A:BTqc0*ЎcΑG x WZz]3k-$:Ц|К/mÿ7B!UX=֢E)Q`B*a({+giZkɔD+5?MeGm&8l^Ϯϖ9kj{c||sx!jǸϚ9/gS'y -ofvڟ+}^0Nl6Lz]_Ngl'IB^Vɔ{S/k79ڡvŠDtjvg}!OM+bڭRևp)0{oq.i7A[:yڭѩv)BP=v&T!!/\Ddpf4[%<]4 8,=磳x(ru;W}^=fB Q"Gc`L VsQH [J,$@NІцo[v?U8]5\z+*Kp@u1X\].8G#->>'RG|h[{ )oxxMS ݗ *Ņ@R#EJ@:P *G,`5pQUlJְ$FaQXQFNfXBI\[g=7NM(ѣS@:4ڤ1 S`gt]ӭmҍ nQ8zGYh֣7ԫALC(T& :r 0dMrnhj co"(Q>㡚  ";fƩ{l)%WuT `'<.D׌q^Vz! 
Gd\^粏0V=H(xEBڌ  2ko]pI]e(e%]l.fO+颜x!{*.ⶒ.:0B7;wSD{VA`{*Π&wGpt!8zBVgmGOjZgm'RzctOoRCH;"D"1#3ou* $W>p'c1J%B։TET泖VK+e4FϹV(ť"**DEAOъ~griU ]ŒJ cE(FI 4q&-#N-P$|sK:ו6tpa* /"2::hZ JDv.mT^MhvCB^Vɔ{o /k7E>hZ JDv.m)[5ڭ y"Lqf ;̗7|1ě ڕS%< HZY|^@Rj1Et¥G )Y ݂Sd|#5$Y%Xi,(BdXC*13bTpZ*Trʄ'1 ?J;2i"bW.5WuA}]ș9-2$xo0T:#`RA!1`K TbqmiD2":HG東Z^oФ"q!FWQΤf¡e x :0a@IS \'}JHf$}g 7Π0! nЭog Zm!xKōriU,1:Cfh!9Tj)EHՆF"tEZD2)By [RQhMnf`NN) ^£ 5g;/H6!+wt5!O&C[4LPXr3#ƒ8޿a@@2BOKp +LB@%?bc$.dZ}~Q1;;*U<]OdYW_ΛU+=I]0$̶7u w?!^I3"Vu[6Y-*28+*r1S bk; $/zjvNza łU,aU9 +XjU+K**pg_{4DGy[ܦ02K_?qPA)(۔G60Ƀ8/N>D g9s9/p7o߄8eGO7'56j] ?[O.V%^LN]>NP^cyc*XmqQ0΅e6DǷ/KQ gdEMɊ\iR ߯.O.3|xfW/XW앯}XN|#+jja9Z%*'>k[eY~>_]UA&e|]CqO:~ý]ro+pu:t|/M;~]ʈV鸖Ri~}$'/Y( ܨ4:&,6dʤ$>^K|.E?YL/# J|H/}W% {뀜p$K;MV'C.\xYXZ%|+-@4NҰ@l-V9Ro"W_]z3 MWPTq}}Gf;qrG??xCF~rv:{ҤݟGaț's}畭f t~E `*-sn e^n׻N(L=ɭ`g&H$i4R8A!qz•NpaS V0k2P..H"ŋj0r:Pl"=5\S*֫co_T([u%땴/A.5\+m}-6y_S;NLp f63gi` GQٲ˻ãW3jO &W22E @)d [(U ΖmƭdDbY+-Yw[#yiH*eIN}/}sy\xTSWCjs?7􊆙Uxi3kOǭrk4|̯-ޏ?{F_1L4 {;ؽ2JulCg's~dKZmf-%'3[fYU*>.2|ŞٺUU GZ]n w_xe%GAp?%>:8׽eGy:o3+n<`(d8TsN/γtO~e-Ëk^Z6&[,+[Dd5r,7nz8{9J+FJK 6?tLV ͉'R Js)iБZN:8)OQv,(= y-N f ݺ(-N?O?O?O?o~kKF Ǽ#Lho6zdҧI[J*jq*]<_:~ږN%;kQJ''| g)U #j.LZK<q h`:B1K,Qj4?8w3ͥV(h=15p ;_R "?.W)¢ouT)%E&x?Q^W"w*"ݳpC pm Q?5ފs))`4vsB=lL ׍OctRZ+zP0_81)њa:?^ Tz IZ_GuG+F1La5ɦ|g:g=&(?!B0ɁYgoQN*6KMZ]"|t'=*gd(IQͲe^8c%C&h@n2jHjf LK[.%tafO2w9ye 4G:Rf D`evp>ţ ԄNcx}``+(9Ah$pQF!*\W`3%iGz>ۜ5u%qOOd-ˏ/l TU=ǑW9њ9 A'<2Ez% OFEQm:!sLR}~+yK n-}\$=H9G㝘FkٌifLisMޫғM2J?Uُ;rx1RU? < Ј/p|`3G&C%!.)oՋ,(儗̓j*co?}rw& A@+ʝ!+koJԡڳCL<-[AebrJ$L;uŞd<|? HN.qg]`->nh(rzݾ=wҩ߯x{Ri !^V8- x=nhrs{{\(i1nr[6N0fg?_<*~zۏ}|ETԬSs+ί¿#RXEtzs/G!gdca! )ݍ[c_-_)o"eޒ5rv?2*mq^dV$tg#+V>OThp^B%TW1Ѷ$dyY#2vAhtT$ wEZ! A2aNW>}3, jJ(z,B@l͇ƳHm* )[|Չg S|qkWgB/nGl_(#ӍK1 ~9ӍrzU)n~P ЛrwEny<^1TjK ~f'>(`#ZgTѬ'eyIw.2TkZa";D k]H!Q_QҟjTwƊY܀@MyvUDe;zPR ̮H< uJD*ѯ*Blh㝊1Q˞fQTEY⾫a*/翏YEQ<_eWii|=)3w{>s[/?=^\\] YYc[,}^\kE}}eC~h(5IOUYb6@B(9b$ (hØb.#p+}FтAQ_S! 
;R]B]yy\2AZ>'[v=0D-ɋW|?ѡ.?\A1qK+qW6Li>lPk>YG&W=P_-[x_Q&h]T yWOnw8^A U- 0^i$F k^>[p]H1u4/N- ɔgvW_>6Kj/'W ɗܥy#E&FC_5NcRfT8ejD=) ČԻĦgYoiwYo7./&qucWm5|V4T)rY4 y잜֗Lヶέ:-oCxYGWLC?VBJ!ħ'U!#poW)o:顠C|_ pK'oЦ˺|X:şh`nm;!Ԍq x.'*NY]վk5B=k<3%ҭ9WLNMpdi ű(kqhƏucԙxAhfxlr=.9A&YZkŹXlBrx& Pˢ_DM_!#/ !M5fBIc\^/5S"1/>G".?Tuѩ_uaw;O>0/#S]rÔ49~n˷?Fmǁ7JjuΓ&lio϶hʒ|K4+KInW727"c yW zOȼ= @[#sU5CI7!TiX+$k.䌝o^w\95^d޺m)vMg5J{ce (,:PBf !8 &Z?=C^wT=HQ3!,E$fjvvK z\pA8%OsG4,@X>wX"ŒKHc:MȌ  RmhRƹwZtzgdf ar`5pz!YUxƥ*?YoU;R}a [ͷ}Ԟwq; TR, KrN @ 4cɚ[>/ZhZi_[n,vk|Ƣlf܏b8[0/JgYs +L0Ur 6*kKkely^L㛑-b {F'F(jwijU[KǓ éC7xƒGLظgM80AiB)8YiUI?NHjoD't`NN҂a±[ЬY5Ϛylͳlq`X ĺJ5Fg(U,lY"zR׮bmYQ%cMs jA3v[IR.UE|r偎Qy3j<%z2 5l#Q%*FPyʊH¿ֈ؟r'=2z5o#츆xjDum8/#YYJݼըm #Pe,$EPY_;Q% g6Ϸ븆(`ކR*LMޔ& Ex%,ya?Tcp+CJcѐl4ga_N-V(JJM]Sw|ϔ7^i Vؾplbn);9sՄ[JF˭y98m0!GJqkܭߴ w75OÿU,ŒMs~pYhkq~>/[G,yAZ,$8:A(R@Z>h|vOu[w9fϡ ۞m4x6 c:P ؊!II^١ۃY5"yC ujn/oF])<?mjJ\LUZ|$w[I%r}k`^jӐj6&~ g~cXIHҔD^;Wq7h vFa`g 6ړHV逧eIInqß5׋&s:69ojG0Rɏ!Fs^%v ڈ6 q$BNGI#\^jm{+cjTFv0cAgz#mI칎ۿhh%!!o/747*++ $ FEi2xeǚnv.C0RzA#b]OE(5`-(( +$L+ $ A2;bCA-aaN#Hv!JQi( eђUE ea5ɽO]CpF\IU6;|kvU/݂Fd-,;|0Q.!>hJ,fqCiYV\ck%gشO^9O'p/Wf:1l\,jG=*\\_we ZF8m)Xcb)m6wm ·1aRA{50x;IyQ+&Nl_r1VI{zu ixA)KBjQZGt=<;K UPVR9+RZ2ڳhv&ǵ˙4bzVGA swz NUŰ*yk@Ve|kz3-zL`9]^K.N Q)0LG{PDe5irVI8}B^{܏Hx_.SorC1#zn{hJ@6(%Dp)^B9u1OB.3w挈5{4}|.n~Q** T4T6l@k6 ![uHPe86Ӵ}OZ̊26ǂ T_^ et#i$ÞD/ոǀNU4Rm o.FJkmAE^[>iZJZjlOUMNMIjiΌs9# OyO02D6r w-jJoWAem2yqͣQ쥪ufUALtg(hZu%ڽy^\s1t5Du:ZѻĒr nJz6:%R61RTTRLGA{L{2xu-9J׼ IxcY:Ă>wP)r;a{|oI62fE j oŲGjʄ%nm`H a| p1XSfߴ &]i_eX9[?;%_~ǻ<^ݛcp jgV=`!(E}ysp؆r 60uôsw3/G@"P.|4Iь"P 0 H)=e|_/muWPw'Xe2 c5NyQX6b+SXo;clCp)e8th{Q1Q)vZJ )cۨnk{K.7EӐl {Z:A!oGhLYi [!#SO- /RuS+S4EG6Oo/YNp٩mX6$D皋|A>(=Zv~f+ej)KX 7v8UKcI}G}2_/ Ԓ8-Y̖KJ|II8Y8'^&wFCpS4S)E0ELz!zG m&QUU >JrSsiEK۳awm0*&ݧا˟ߧNiO-7AOm[+-릵]~Ҵ6n}6\%(g(*{N#qN'#XA\uJ:Gr*Fm>u|9"GRo/uXZ*TZIBQ _VP<kfV$j Bsg72 Jn{`CK&k4©𠥕mBy)eW刓[ic36:AkbT.BK\W@O 7*q.6rcϿ՗KkM bFBևH>#749:%ϗ=X U%?#Qx Gt_cT<ہY}u ouHf~!8Jov֍6Y75= .Fuo Ce CR!/;aH| 8S={/%F)%v7 OIǧNʯH4ZlHat# W-Ǿ׉D%c6i̦`+)3C@_oFKNX4l#L6%ʦ 
.KǪgtcJ )EoAM7 qz+}lv$:h{lPym4vȲ덖.>1_[t,JkJG 2PVFDKUikdZw*E0`=nZo# (pd<@$^cMp});ތE5*E)%7%҃Iup9ÿS߻2T*PM":`zl\}I'G46YL@m[P5s4&ݵ}s=D 4w_@c'tOvh{ o\Nj~Rn:c\f/% ru"OP+1| .f"H_T%BBj::M6J+ШZEm 5YWEPړ5h~T?{'΍:iV*:;VxªNvqJg<rqҘ!Ӏ)Jcgp1Ž`!n˒-ITI@ٮ8;uƯF$m>+'qbYrxtgS8Y0uh'N|$ k}jXJRby?T[bsԩ8'(cL=+!+-2"V~ͯ{w:)(GO1GBgʕDd!pAlIU T}rmBςJIR텷L3}S76h++ J?R(>u :E9sƂmYPV晨1iG4`*,"&:Q"+74/rG̕*)kՇ/TJu Ċ-P!)zCQ`Tn\cBMe6)LJE0znr1(azTRQ=%УrP!A\od`y6Z133RcL[نdQ0ɾٮ65;U H[@T@Bohg5?ifJk"v#㗥q(p5V)"pia.:6 ܉Rs ".RL~urpF23D"18  ,x˘\mjncŇ ܞ-&Lr:EI!?3}%'b3n9w̑\TwysZ^%*kpwtfy%ekSXkvsDhN8Sjx*__N_鷝k} $l#΂}@l,0,3A+v[ q8#`8~e0k;lcxo3ןwp r۝N޼{oӻ -&5N~5G_:7ok׮99O} ~{;ݻD{ qU>n>?/\2F]񵻸l59~n\7>cuMp>^gR$Ҙ4ݾ^3[}Ax9K!B1_|̫'ٸyL~:49|O'^?k`/Aɧ~r?[+v/m~9=c_/@Џ7L=ํZ] ʰUrk899|ƭ\zdUW5r`4n Iw!WD'S9qGowMo?qZ;'`ّӔ*w|s ,٠4^9I.`?!L^!ѱt])پJ5T;\/ W?Qy;jo a"$ z#v+sal]緝}О֯?o^Ot}_~|wh䇎.{A#G7z 4ĔzfhMÛppybrϿSQM`np~{{Cucw S~w7?O0;V?ҧ @Ѻ_fN~Hjn'߯p|y1Q FO7w(Ld>]-|N,,,ଈՙTǍqigαUJn dx*B1ilpG}*G4J0LU5|N`_j0X n!;8X #D*jHp;Ѡ$RqfL)pc`YB[o LSL%J 2zKpL"2$GPΣ.xƃ >5` >\'?N0"+4fVb.HA C0_e Աu^]] 9;݋ 6qpdF<-}sJ.GykdaȓnfJDT?9R4g_ =SsIXڙPD!6Rzc+eMrLi/E5El{jF%9uvJB@Npe,q|"v`0(2j'BSp|V#+Hnjot1:ƝTǸwuyusuϾ:nwj9)B03!0`YH%Vn -e3R (:?+NT0S0OyWz#q,-u+v)R,u)Rإ B`AϬI;y^rڻqx:d3FWfބA?;}Vod/Z/_VӔVwM1v3ZgR0QJ s`_V"KI03"$\ U$'!<3fܐ4@Ll<uD >F {d"TYMTYMwSnG _s]kޮv+1BLn lb dkmVx7v '9F&1|H@V'!6 ;~դn=:.)EʇNzP+2Ua'Z :^+07:i$=y8VH]K2)~9^OK*"PtI0P =퀰K9ZȺԶ*oP Zi̺9 %&؆>JSͫЩ̝VB5cm&7y$&uG%W~e;GGY^hެ4|Heuh[}].JxZH]% J<9T*ʸ4*&qC K3)aL0)IUWx@y\2R\\#Pޠia V-F4 \,m# >bs ,cJ-,f[*lIR&8MUœt(lTPReDe§GK;J&)nlzjeQ+Қ OU aHH"N9: J%b#B fy]6:e+1I\G۷(iJo^L<|nk2R)pD2E員}mkt+9Vf8J,AYun*w$/Hgyv].4u{=900GΦ50v:vW   Z5Y,^vKwWRi R2hJRх i.B)0j`Ý獩3ؘ77 /ܤo5`oJ2a\\@ALAu,+\{n8|FUM?}:ܴt䃋:o)3ɶ[w[!D+1#Jj&>u&Ykg"eΪ_lV+ߑ[2fG,.'}ACEL)ml,R g.#0zz^srݯn̼]fsfcr9 <`HU0ji?=]]S9+wv)J)ݘˍ }0`3oj) x)TSue/5I gWW^y'XYPJ>W稸;P[:~;M沊YRZIm SU.9^^<!"K;91qj jQɃW&; 2!A s(ڭ$Bj&195SXsa195ǰּ%e=Vl.yn`3rup MyY+hjLYыR?mJ 3~I5ܘtQETW`F~7g!c $>ύZhʖ:;DEw;vK6oXo$M,[P缧W@;Uk{(W6ߣR}U+9ٝEDs9#RDk. 
)c$ŨXb;]PT$x?q%Ԏ1#޷#IHjGR;#\OsM0v :6]t>wLr/ hmGX?g@}=SnĕAx@ hk6Y{edRh#8P+Nh5:('WR[VgɌg>l}cg>#M3mbއ'pf;hZb]rNA)Hߐ,7GЄha>^^dw<}V+M ~jh/fP[m ?U!)Ckd0f̐mNY'yC>2EOd<(^+)\1\1+pOa5u[qn: ÔP 9b}*&g⥶<;ny9^ܹJ2;4H*wv *HE1U?$rPÉq$iOyj{-Z?hS8 LO ,K_1tcU?.Hu} /vv%|$^p|w{%{;u[qϚwvf=}덵7_>l~Ō\;o[yY[_A[页uFJͿε81TnUVYRp޿u.gs->ݿ=;՟_fE{'Q.o+jwgInz[fm֯"Y^VΝA A.;wX9 O4 lrx' yo?vhZE5X RRZ{TI›s # _jpQZJ2:H2݉G[UrkgOG,$v12$"6ĕbbZ kU)Ǖ4Ssmֻ+0<)l/_|r}x<?AP/17ר 4N=[5S|֙zomwos}Ʌkm? $K=]moşNnSA/[ɽg%l4zÀ1n~ܥ/s‹>LQŤRxC6ZEt!DZ] V3LŘ@T $ $(^]7erŝi=]xͯ7t{G}٫WnY ̈u1)wpby|bVQ\aamp9CO JfPU |E}*o2{h{y;w7*$ZГx"PV4#d 52lFuKҩ%S`f,<ԦS 垚DBVAKN"KȻdc))bs6.YjcXT/; JJųk(Wya5:NmlbX`NrvP֠n%N;;tn]I -'uno& 4U:Aw|Awvssc^}TyGU!=hφe#'oVygc̕E 9D.z<]mi0+mf"n3B>RdDf"@l1+ѹm 8YWcTM+@%ܖra bIYIFHFkkW[mAP@2+c]M–耺м6ΰYn(qLܡ"eLQ` A!m`R28 1./`,c-4$@gEŔ d!D}p6`'OY51uBJġ֝LxZ>_[ 7cTހ\K׎ ơ=Y#4A;YjE”B;^>l7}腬84>!t1a.CgtK05N X Qʖ=}huV TRεIjqYB+kôyCLۇG3I![Vu`Stɀ% } KAVcercKV}I墫\jSf ΀o2{IC`F \v$}gPcC;WB$=b]Ԟ;SuAj ]u!P-L b CbU>BKzeYN+LK+C&;z)X]䣏g g6w@j&T˥ZŊ*# =fcX堂BVYkRFS\Hhlx,)%ƛ@ x۫2^>SDF߿/Nv231O`2sQBUD$iOQbˊw&2ɿVO8ek >= M6ޯmÿnll^7QPCfC׺ɝvOTΎ f5o8ѯ!3Y5wx}my{F hbr|8:W&jLaA޵7:Czpsۖg:raS3rXfN;/$Qƒ "kp% E϶$gJ pH!T pEkL[0B! 
R k۞v8$=9pA7xڵ$ eaCId,1dj.# 6m:=IAjЙ="[G ]o:\:ra}; 5#{6(*>%Iǁ7ҵ{mdmVq]j";޶<ӳIj{`C*נC}\AaIʪ;{w[Ad[ɘƜk>L )0rV \ow @0Y9p b睜)T^aAxڰ3 Y 5> $QO@ A(@nCE$&N NMaTsxT[INNGMB+8t_prvP[Ղ#RE暵D9i`CIn:A\ /7 *!V|Ta&CuYH/J'$AƠ$Ci'R".T"}8L 7d{8]VIxFA袑9n ">I H "&aMhcMNcXY: ڹԡzBfnHrsfJAn7s3%Z>MZrg|#y}ǩ#+-ӮD|"Ͼ5[]_-ϔĝ&_d њSW"Fg0S=#s7^;aLy+I3j3vRY>$Q6Fvw03vL=Hsh Ng'mC2*ovԞE _%_ qUp$u[~ŀ$weAGn={MȜtRTcB|0|q=P!jDڴ^d鍭0>hsG (2|5ע!Ln9nGQ%_0:,`Tlvowm{&~"ۍWvaj*{(ݟ폔@N {P_>;[0H@!W~;; AEk lNt696]\]YB"6i>Zjm]+V2\8-gow\>u  LJZΕ퍠v{рS^=T6dIBՈY)%`&kIM=ewR]\Ǥ Lu;޶=#pqVff:}IpNp^>8ko=%6ddMy8T}+k2"b~8| |T_p.k~1N?466rs^}wNM-|^)xG7bn'5 Z&8> 8!r+$$_l]g298^h}b/NW$sf,`Pvz+lMhI^l l'Tc>eu,:^&N#T4: Rȕ}eһS sEuPQX))Ë?^.Z94_)\|<9=e;(I2]z9;g}cs7o}S/~h"/5'jJ&WVpDPS%`N/dܸ.v'#ſ~/ŢqzL YVኢ:T xo{>ޛM󪶁]-OaеK娴Ǭj)|o,~aj܎şAz/joH$uv~J!Tŏ߶RYOBŠbh>ʦ9*:''('2DSIeAW<1SF+(EVr\ }jJ/W?:iOegHN0$})DvH*T tXxhcJ;m31ֺXBO>ִlHEb rm_KpW)‹eQV 9ɐFtKt6tX= 0]lB'IBhN"m[r'mLUMYz#WHh:񨔪FU23q(waD#(53E'JT4!@)Hڮ9,y1$Zq^IYgo)=\m;DMRc0 ՙ`K G]OC:n@*A\'tq;q MdzxCc4]#p,:F(aOM$쯯_?P2]/WT\>]Gukq{R;/^\m֛3?b"xURM2- |X>ܵ[EG%o%^9wa߼*5_(WrG#E$E?qRdu/ ezF!E,jf׏)C`6ϗ7 HN( OGuQPptnoѢxB_?m\\qk/}<8/Vw ~-uP!+iqn99]oUxck63e$/ZΫr3q˼;$ xbd'MA֛+t5jTR;ktPIUա`fT'!_W+x?UZ ]}l(]5 7VQ9o.1AW7 ߋ^ݜ sJ^R7TyU21ó^mxj8oc#yC&wf4o4F߽ҐM't:}J̦4jH'w}p.'e̗o{ ZKکgWS.us>4,#1ae")>i4R~tп./b:%ɃAgQJ2e`΢j% h=wH'DVW*S}md?dڥ$4#z1@PIa{#칦oTJx9UPJFè5zFKנFQV)O*mҥ Jf+fmnm8]x@XL=yU'Yܗ9D~=$=(zU$ퟓRYɵgq9&yFƀ7J=;KG>LyNfNedwwFNex0}@!d)XM.,?ZǺ޻+Wz_9(i7O7 GVG ] Blo}DŽЍ1 X$DRŜGȏ0,4>UjENZzȬc %P9`lL˚Un3`zXgf[+!%Qq^LrSW"޻cY6^32m dy)‚}4(sr_!qw Q 7  7#\Ԛ5+;D`! 
%T=a8a<+.v~C`s N뫚=utu E4 1X"5lT!R$jPQf@j6JHZ)ȟ`a |ekyj3Fai mo>bN')=Hz`B])m1){S *M&2m6#Ù#&v).t-$_yZ'ewʎ{tRbƈ=ut0:Vz>upS~ׇ]jw-BD|7h53^t(PGٺ82_"/+ ubH y&DX <5kV$i@/l7r PHf31:a+>% ʚn9sNr!tM:a$(vx?8:RC8}-Hz#^ `/)Gǵr;sf"oh߉(IQnk/k&,KUXnq؂*ɬ }Lj(0}Ic*u1v+#RÂl!kH(خe@]p%+I.̓gX\sʈ, 'T[\b:H}?L`X8W^@% uCFB:uA#R9+ԙb8{$C+M#/!GN41cZjse$]39f::KFIFĘLt&Od[ľy.@ϸ6vum߀l&@Ӷm$kP>V ;=:;Z6# L0*ZyR^sƁl*ĹȇX6^,S<YX,VrMDڹvpslMiE!Ej{j3qyZZbn;ٹXR&l4\E 15^WaTW)NhAm$VTcm[]bp}ݎ*-v7u%q VRJZb%WZOGb-ѨQQB#ߣajaZ*Xأh`Rѡ @{@FEd:Nkwf~h+B%δfwVU+:F,+![`+k;3fjMeC fYb.T %$BM?/kD %P+>}fJ)iŊs "ڲ6ߍL-)S4BoʜٷKsKƱ`^#D"O#?$"A:. ͤε 54 XÂLbiyhwA_ %i0 E/ ^k̑[WOi4%d#HG-쒹T)d-n- wQV!Fm&Jɪsj ʪb* -n6MapK+u5i~ؔjQcPM MάW Yrv!(ԛތYq: &#=-O"/ IהR eDBw;Zw ʈG:Çn3 ygK!NǤz (L"yRTL 6!. Fij0fv&O](34K:q+ASG9|Q81;%w&x-W1˵GrzM*ʈ=ut zIn#J;G-&h8DB{ Ѥ?!}ِ#*=nlܐp/hhh+Rޙ-"mj*:lb gA k:OYٖ8SѬq⶘5y8C%*q$6[F\:F\VnwqtS,#ꉟw:}u);]iZmM;4} ,>s51ՠ-ZGaYp Z ۪s eP@nQХdow#{ч2.z rQmid;f)#s"lotqkGA8IgsI&3&܄$,!&{_Thl(rՆ>U:SyZyc &C &Hg*6C`8E~=r1HOyAH9iBjӸl|DG;rp[w$={#Rv`Gv4o2-$e#֋ 6% cKΏ9W { !6$2>ﯣp Ru ~ve6.BN\dTe2k΃2WeKN6m`Tkc#!l4D_X;Y͹8&hUCNcL-o+9qu0:+l~6>|Gs鯲_rCw/FΓ[~{a~ (G'|"#3=O%LΥ #7fOxb[ TaJ$Q-;J_0`qi1P3rV?-n|q'̸R7 ,%-eސ;˰NnE@1<'m #JTy <`^ a"t}\3)VMj v2h40=fNtBgC2-hD:C" L;::Inn+7op ޿>V<[?Ώ7_bݶpgng֜G ar~TS:i}eM-DKrK᭰gp0Z94Y]ѷ|Ɯf4پ,4Sͬ?ᘩVgN-# ql1"{gۛj:p u%k?HB$/ۭ)|. ;_p0 YF/Ջ$.Y8WgoQ `!܇lnffY^̋wq4H0~~9lꕹt<@w GA"d}GG2!x:$$2L1pyD#&\CJ+aYn"oaWW~^ޕ0 רgbt.Cn`40xNϾ{py:3sCE_O/,W~z}yla{Z@Ug⭗|n͐xyuiaavE7p RDڲ ,y~MA1}?>q`8}}{5_a򏠬6p|yzvQh ٿL{B%كߍn)WzO??o(I< Dsޗ<Q_}WR>B龀;.C!|O)xC F/૒å+u=P`xpB›_9ڼaOqC\T_xˆӻgA˝ݤ/yC 12i}x6.<+iS|}3W[l{h7ٗt?ʸ|x?ꙏ@P/^͕!GFï&>?_7A|x:=a»/hxaْg+˗oŻ6jzi}c)ry:G |=+GI~ _`hιM_VewXQx8~L~qg? Ûn xw:%[ս~w-AkzGs޽n|dҤv3z p}O>dz9'8e~2ӯp8[V.j?Lyie:_ÜߠOwiQ&MJA(u>Ǽq g3[.Ӧ_z2y=Oן<:{s`,W^r=jiE EOz>Iuo%4wy&r{m^CJ:r[oї=):RA?c9Z-q0Sq(`*H{ٰl:L0`Z&sˤ=E,;AvEYO^Yuf;gdw h"ſ)ZfGMps77ptǔcT/C OW#Fu{;`)u3mB}0<xT_ >d̔>l|` ޅ{0c} ߈~B{6@y'xRAޢWq=z U8vB2$Ai=Hk$KBXqĝ*4RYs*6\y&LV \ r^. 
UUDI?ol ?.p3  0q`dJ?{Ǎ`^ 6=!Y,hO7N$݇ŀXd%ٻE{F\znuXZ6`Y&?#L8LTI}]-=2MTQ~g+>W+~ ,ڸ5 ^Jjm2mtq9 e{!]?؝^g&wᅞqQVo^FŖ[@uZb}^myW}(;RAs s#LC+'1c. ŏdbTbDqN͏>UAn.㌧2b  [u6^݁قŊ}tZ,9l?1_}_{}4zh {2r=]ŰkdXiiHLط| z{aVal(gOZ8y'**wl~ڏ^SOP2+e AQiؼ<3Tc9k?߫n[~6:B@/9C }f2B|i~Uq_{a|r'r K"f ŏ_ew$fqm^_r\ɝ\ڏڮ1gחo~xb.W=0:K>l>nHnhӴx=K%fÇfiO1IgM&/VQ4l=xm|ޭO,f--ldqs~onsy=ICF/LfK'|\h7&n!^]5ƩPf!P!P$TR(Pك -\5%/uE2(BF袌K%`֬IH$"QHDmtTz/Ǿ||H$&qX>J^YiS>H:r"d66آrp** cWorlzz/E@Ν"<OwwawF3@ؽʾ:b?f㖚#p~POOKZӓ%^B̖LTግʧ J(zoޚ$Yv${HvR.z+/ƾKnљtu4)i-:Pad,>v\=_Ûei nU0 GIY ΣE_"G>eGGNJC iß:U|s>vݳz!_u;l^fu^:~zlbFLڎF}Q]DHJ`.tV3c3k2A)jBNx_Vjo,(ϘQi<&XTĸ94`VhШlCOd*1Ad(5|c= 3L+?n;<4H큅.k rj7OB/ٜELVl)p$Kg c &WW%8g(:d&Զy4uϣ{Mh6<ܱ#f:(*er2 HI6Z@ϪS^8}:0_ϯU6 (GtY >x;)F F[ YdH$LQ%Ghfe_+E Y D6jJqʊ#%F@*hi6l 1nbdMn=ׇׁbYfEs?Vf9?|VήonPsm. iTaJt?8 i)^*8?77WF:o5όDϮ~[jNwNӾ՚d;P {)\{m6v6JZȧ/NPd^*q8߼%Ov~~OMC8h)ݴ,88hNMوn7Uڿ?k_޵?WU~pήѷxUIWM7tX{>b x>6>-aJq"n"tb .YKuReTgyfxyyL:O}dWݘ(OvȌ}P/iX@6kg3xtem4pcw㧎c v}ꙸ3]&== a5/4L+ktEB꾶:..A Y7XT%_݂Fm_zkNO42H(EɰRDr+w{_RճCY3%$- %FQ.%i84pDl] }TIS V둰W/BC 43̰50 4C(Uq(b;=%V)&فC?z3 x>IIdWԶ̌FfsRɶ%g`!ASEtk?}xLwY.TloѓNx&RL5Ct)K32U`?irNٻ޶%Wc}g}zdq2KѴ85jRuYIʒ-'E]Z\أ̅` R2kل`I'uȅtI%dtPJ $) FFd (AsOAT`XĊ#ѼmP$i @J/F\ӥwA M9sҮ6+8Ih'muB#m$|aNpJ"V 8Qm>>5:'tvA8z.7wqiRQT9ʛ8L#6.DrFϝwN4ZHDx3wN̗(2:Wa4DiMֳwRԕa" 5 nV**vsgLqtn$ϳBfx*V̤ ҭP4[''t;FFM+=+Mp6?L@BZ;b ;P-Y[ {L6c3!`z*@$!ė%gɄ~䊌/aEuwOIԪ}ۿiO-oq8ՎAu{Β6FIB2*OSI2 uza+g9`rf.S2nŋz; X1=?̒w5ZIZȚ_VeGa>+~B/.*~~y-T[E*GX^Ҭ@\q&9/Tep "(%-lD6] _kHb7胴 ]S*sP5zc~*IK}N'\ GbՒ6vQ67p_(E kD3V1/'\ՙcZ *f1#1܆֪CSŅTVg,jbAWOAFyۖ,D6:9tcuEz%u!1੠v JxZi?}/$d2~Ǔ0Bl!WHՄ  dK?_kqzW+[QaJj.p 4xytp&OK9SʣXJa@5̱2(m0bAI$^#TrD`2qw *bS-+=eE];kpE=5 uکIEYY~4O(M_!K"Iu1+wq:>JOt2I6=/Lʅ6@@Kac9P0\mP.,ǵn==іиJ,~v&hgء*q/빆CUvi8HxA/i &@ܻVN K.fӸ%_kj1́G0aXvEjr$9;-jS9_ meynL 6Ol}_QvPZd@b7r2Jdz\nKg9ΰgnaLW /閥}qņ}_^h$9}egl`Yvbm!ĉ{p S-O%idjRmIqS"e˚aUٗB oK(3]RFJT͍mfB̆m6%Zd*XeۚQsEz #!n4k'T!~%KgeNv޸Q3 pKO鮙[/Ow]SGDF-Gq3L7!'\k;']9I9iXߋ:NÁoT 97.=wN]RL-[Ai:q#9 G CfBXn 9G0;--3(3 e @hn=Cll#)Ϗx56Jul 29FC !wĭ]FB2{ch/@:h}? 
4zd]\(OtY)h1cJKH 4\Gg:phO0S(0g ,ic~f3~* tk&G _}beqw &]mMTԉu&m gN. v]hjIOh$ 1X%hz"@m,}qi&kCkaЅIm3ƜOu'xt'iGߌBeg 7@E0i;Ҍb7z`?.?_ u;`x%jFn;Ψՠ>I<:[`f;^t; ^d⯇wz@G%|޸qxѾ7w?0ïo޽\<` !†x DCR P 6J6Kxa ne~C9<p3ćRZ6^5DH˔|J7FݿL3h+QV2ܟ^b0yv BƖYha\,qveFM?GS FJ|>jEɳgKxYӍZɍ{NI<"@>9/,A+nݼ{ss2ً{OoM?fJjԷ[)\GgBZ}lH»vԏiwWv1Ъ՝At^e\jD @]Tcz,)3]KJQVAWAK}oL=(휁=Oٹ8U=viF]]klod7@3sU`:^ݟ|=<^]_K@<]KMU(;?8`s n-N_8@0 |lv}סیMZ[PIq~2A;|4;.J{B>ٟ1zMhJK)N^ӛ{Շ[OiK>hfx?h.d? u؝2h |^D:t8Q u,{?Ű\EƠpJRA'q9$&kƹVr5b"wY[EqcXPBQ+q3Sإk3엸V`lz酱ܘ녱ܟۭv7zKY󼜉+WAa4Uǰ}ۿi˅6r+|ݸ\o!@];7s+)ܐ?“xq$Sv DŰf& yj_[B Xye0}?U2KfA m|XJ+7)#STAY3>V#`sYpDs+i ywdBkmJ-(G KY9Sld0f OPu8q hUoAf;w}Fk㠐;W}s8ɎS-_CuAm$~Yr\qx,'jMZ~YW4kףZVc`tha#Iv7bn!jS橰]@Ftpx10r2^vGpe4w]09@܄mZx-\VdDTWa5tɬp!Q@ i{ R mgsOK#YB20EV'W#$5hםvd>}shno{Np>1kPlX'3;eG 6;?̯]OZv/~on~}Y=cѝ}2q}7O\tt0JSF|>2iGgps#rLFvIeFӹ 9(n#Ӵ{-(_Guh|Q9cݣK "YK}arՆj_k305T^nMoĺJzTt |IC`s'䳍)#6KPTDdL%zB Rir/t:FJκkz>vsb%UTKYyxGd:l%/>}ZqJ54POW)$MҞ $RutLHތ0FzkgZBje,;%!E owUҕYa}bxx崖Y[k^ˬLAJnZf鳠tiEz&J42VJ] +"?k1oD'ݥtѵO}F9_K9Oą̹܁K4i6S,%690-АyXu .?-r̳qŴX8C-ըڥo>YRWkQg(o@ђo6QAj@"d 9#Be⣽FFkFciTjL FT{ vpce6t3{1b>E}ɗ46conqfSYq=s~~~;@a8?2ʼnr[X8]Ȃ((r_*LaqS]?_qPj3zyX٪eT6qW!P`B(7/Oc4 FĨLB&BSf@Ϝ&,O1B3 )dBB+ɋL+.KNM ;V8^$X *dZQņq;ܦJu!YaI _533BDtѭsl7&^$O3״-7_LEԼ31/~G1=;|wϾ0mX }j$$ %ǻ*к#I\\U5}+eƱaPbpŏ/:hEHJ&5NB;-jQG[*D"*۷v*۶8 (f;s\+ޯ6D u!av fp"R궣8B :θAFv.F8R*RI0,"LE@8*hi7YI;[_aSeXPʙJ2b.3+Gśܽ&MXPEͻfCWpq "o+RӜI0C/f[o;KD0$UD5GTSQMxFP/Fy΋QΝ2F=Y)c3V0gsˡ i8P3𜆃\A>›o#i5«Gڗx0¢n<@G/.#QHd8b&&<ptFt𓝜?Wp/ :^x{,d N`o-5Bh;1`iZ:N}\J,'I cE@Q,EܢDAl; 5fd# (lCgߞ 6u{dd"oGPR#cE Lmvf/= 8ˁFi*\a5Aw>(]!C=.ro]}{pvt z. 
5LF_>-?>/M}'Pqa]d<' WZ(c%!+'eٽݬe7~?hnkmg̑}5%/YO^mfY ܯ׹)"7R(c=2ILY%XZg0fi1;D6.K,W2&]Vw}U )Tb WV%ih&б%"qXw V=a^ך0R\ZdK~1#NG㗴 Beݤ+V Lp -momYmih2 CÄN7 ovyFq-׵uGJn=u^mKq3XjMA_]iMtrrexO{A@ Ŝ)[ut& &u7[WmGW3v5KeW{8Ǔ$,w&і L&G,O<3Ɂ1ơQCƩ(Q}iA9\`@ih;ϋOᶓl', af&Ox҄.r,g5ƾB>"I603%% + d@5qyh(92J]_\"F$nM Gޞv ۑ t0բvc]_-I1 | RkPM-U$ƥJ%M86ъXMrY0Zɲ]B*zgP[7= f%o{={-?}@_6^t4pTԄd^" quyTk5kc YǴDrƣ{m{^Z{SIJnG=;ׅ|,~.}<*J(6%2ppǛݻ0h /.FӟDO%|E%*Zq9yʃ,G{t38;0~ aIIl.))V2,8iL p{LBPi2nFBqLSSƳk29EiJKn ɵ,hNuinr |6%*Iq•AER;1Ȩ͚F: _%4]Iqw`BH$JT~ f/D!s=WOw|Q -ҷntmn>Ku$41/_Ƴ?1?޿\VTxT<+W-W&վEyD[yubB}iR!Qxtb7ϫ}h0S>i@@/j }醻/R0#Q=(F i!u['O}}'O}}Z\d&SGm2K qc:Ӝ9CdL/.sX\UyJ-cp'5dtC30 X'O}'O}8^F*(0 hW)+,Tiiqbe`h:P#h:>OsJڄ Gꉡ76bSD6 [N}ԧN}鴚fZ5< [)sJZSXA:Ւ[as)YX(NgEᎹ@ ;!ZNa@=$u_Gc9޵q#E_g~{8fH>=J4KE#[Q5b7[-G d=i5W$bU01geeڻ٭}*¿z.0{Գ/B{ς}wGcU0RϤkekՒymoj+؂ ƥij10\ 8տ$sM ! DXJ# 4.4&F5lE04WWW 9 B MXaR,/̤Hh1UZPI€W@ !aXt퇲g!h 4~^»]KuhjܡLЗOA-)ߪKz|;tW qW˨>tȠ"{ ΢TOj\IxO @;1oby<1Oz8|!F['5Db_`Rg/?ٲݶBj$e) 6, [2 z`"Pq͔BYiQ*lLR8Y!cܵ `Xw\?JaT6fNǭc\WbVҿʔKR;k-p~rFg"t" a9X,R >z2?4*8o,5LvLBM[`3`s(wX(4X"2p3Ldtߌm( Y.f&ezZ0B,6h^wh48V)yőfa+k(+ui #hT`r 0 OEe4XEw;?yۧl:>`vsЃ\8{]xw}ѤR[kԻ-\I=XQzcewVYFb({eb`i zʫ.Lx01Xme3^ l߷}\yZD:e*޻YQ0f,IK*͝bnDOIҴ-?.&.%ZD!q)Bomn AnNnuf#Jۮg쎦nmh_\E7t'% о M7Zi.RڢDQ%DT+tǧN>5$ǰ/]wV {:jE^ ŔFTuٛFjt#SΨd5Xvx?yxç@f`Z% 7Lx:I !k9BCN2kQ1OP\-NZjY8ng"МCzxtÂbңϢYgv)0=,s ҵ)Lx0ASVy)6i4&=w44'3xErGa!Ɍ.}-Fʃw P YL)FӇOWI<գ^MjUH*ql;Χ]R|)ohJZlW%)8񌷱KH`ҫSp 94vu Q$u %D֯nւT[;TۻUڟvӞԩd.s%I%=[)i:ݻUsϝzYC|ZOU7ݻ 0(D( 8wt~c !hdk_![G߾}{Ѫ,|5#x3Q%<|!<_P0WeO֊aX"g^ oԺ[wǟo4ћ~fwQ+"6KEDܪ8EXuV3yT i9NWvPT1[f) #ueEIQR>("z(:RW)"-PT3zvyDU*5#AmbtC;⭮Q5wNW^mFQ;.nN[W%qƂ{5,NTO,6wz䚋,[xgNf/71bx%Ѳ#Q1 >a-QkĵKO)؞LHT~3m'bp@]ة.}֍5MVD M,H t6PVrxÆkۍ&vlvoj "J(`;O* . Dc)kE e~W T2oEfw J5-/?G/gqϿfRߟ4:juԴiUM[LE# E Kra}4hxgkg[M%tpsχWWwAY`xЫѲdDDTo7nf7X}9x d§5)U s1[TE2Pbg1uZ-p6' Cy(!EJ$ryf9JBg:t[3FD1?3#! 
CȢtC oIw(%&u 2GNݹ5N80L:]A,2ɞ] Y+'%qʋ $-!.EHŗK(vS_%yC2 (C:]uTZQU5 $F-P F:ÂKYC$a&U*m(m mm JY J#aEF'x҉l9鮄`nvh^7_bnyMi!Z57O1 =dl,=t4SBJѠm^v<;u;d{1[(z:NkB6>[Ep N bq0a,ww0c8;ńO;ם%Sk}5\;pִrK,8]!>e`겈c |}~UqGF|bj*9ETSZHFpAFNS1&~K9޻<O>O}yyұx߻?*FD)yaZgob_?GC|/Sb]Upz8+&}jw0(^G:F*RDYKDS.PQUzLqFkaiYtcqRDiߓjf=*WPt0Xu rNK,XK!,HCa>@bg^oc"l2i>1aGd0 @ aN Zd;aNC bR @@-)JS1sǰ?Ԛ$&m Oċ_`"=Ry\25D嘰=_Zc*H̴2R~@0-1I A/q\0Tnc 0gc2. (tKDžz^ɈiGAZ0A M&ppLBpos8D%!zzS7+W,fTat?})VQ f]L {PWjz5eWq$ymo2y5;pv.xkOjvB m+.{.nTIPQBY)C`) }Y kV&G"&95s*-z[%[}XBWʯ<kD= hoS/$̧92Cb,aɆZ]7d 8 b.a{a3) M91!\?Liag.^2Z{Mɗ[ǐDҞ6"ʪ>) b.w͋KM#j_=~숞7E0GY;/Q%iW_Y>u3 XK8B"g ?9C.k=.8k;My<堚V=ќt6x]aZ/v΍l>K ;N:p*ו3IC@),LfvC:B,Q߷/j>׎fW,VE#Z4$U AN£V w`tNzd;dF\JXU6BAiWH\6i,rJI.g@PZ>GHI!S]vx ϲfcV9V`e'Ǖϙ|Gt>TgGIy.Cf[epab WGʂ h$$mȅRvoʓ *{Wz ހ@$!+z6.^v.YxR($F#yjV $ ۟kZ.Q-*ŵ҃4E~ t#mB>~3#+Xȭ 8I$~>ar&=:Tz)˥ ށnAORj&=]DmgYY9aޝZ3=EZY6mFl8Y6MΎ5W$n͌g|4e2 lzB$?O,NOT*=M=aL#{Q6&OpCOWp0?&]md,:uHl NH`LybwIC`Ul$$Ck |8_nϺ؊mm=VU>לVqQ_y}G63L[I]^|Bܠ9.$M^VL{㌡h4:pJxIA(di `Ɋs0fo@\o>:ZH@D*&< *cKIK~r!+?_7m^M TǤ5xmƢ笵Y4&&Q i9);jO3J]i;)+OKF>}:¹]Aie;Aw./y C& l"d3wyM }vEF QYY/}u߁4ݧm~%u SW1Bխs+Ll`tde$]:S\ޫ>|]1e]105d4yW6c|ΐMqm7g9۴Mcp*C5d]п+ 1޷AmJmm`t@Ĉ}Hs ڿ,\Ewl"Z6 NWŇփ]4`T Z;垦g C/J-( ?E\"fqDi;RE d >+jWkL@E8:&@I! '&K2y.Dꕽ +`jT&V9+q/@; L\Ei!'We`+~@g!r^g G<U;d%+~%v܊x/q7P{o7; cPsT]^ =yCzGr}#Vfٴezn΅8n&~xݬv[mz7SHrf8;h8ҫhYVDg9Lj<,ԯRKEt"B\̸WgA'!A^nHzav\Q'42X&pHI3aNi^`*:T5SKBF5֘cHZ"([&I9F+?; <]N4Yu !p]di *%bu *-mҁỵoT'NK} Fc"2)K4(ڂ?" f]: ;^ {k fn[ZT:͘p]kQ.oߕ_aEj Gͯ3xz-eX߾[VTɪMOjLW|Ə~F}8!ᢞMW'`>|T%{]L:Կscoc߹_`< D[F4ۓۋ`$)J#\pk^ FJ(ׄ&vT)1Hۇ7hZ,njJ@~776 JJʎ|29e\/zpڭNõ.I Kr1*t w뙩9< ]O%>gj3'AL4H|=N$^58yiwZ#ݻ R]*p1*%. 
}zp/sp BCu"\֓Í/VT*FM3>4?1`x%.$voTj;qP@bBLV,LӣѰ8 )/0(H"a %ޥ PKjDА5Ŵ|~M'xJn>Ϸ-8M,T3Q0^E<ÕX(Kmv ldO.dd燰fҼf%+Gb[ݫ-H*hEn5lT5rPqAA)v@&bq!RԶ ]K)C;9)!O2]%qa/cq:/wIhpcy^R{B|"B́.pB]d-EK`ԯ}-UZtmB)p-4hTDw۟l0DwTqnUuk:xnևB+1a {gYDK$:6X.y++sV;BpI۾u|ss^.#YWZw ɻF (U~X<$!wy2?2AJ8euθ $GctHΠZ1ws+?=3bQ.PBd"9qSV44Ym{> >ursA5WlK9+=L.h4$nwEi wˌo?H9]#NʀWB2kLќe9&Zz\IS`P|Ag-џ'TP+`%L %Qpc*XAkJe#:6{EgF*8 5!aJ+3-{cg/aW'>[Ҍ9ii5{O^C B2RTh5Z@ӱA~< Q7GSgL΁?* Psy~tဌ}yPqxqig[Pl|Pǝ[G +1+#oVx0 oHѡJ!'={G?%w{ub%)9>Xȸv]H59h/O y@$lX|җ }\1L=ѤUWH/,|̚?)˘}rO.ce=dp N,KfT^RJ(QS"E ?x<>Ji.bfsϏA\m$wɢ"Cl^?-Zͮzk=x| 0D*"mHԸ1t"-⼖e,t]"@eP}!4*PO!9eB@w@6VuSXK6yrYjZ`؄1U* le dEsnhۺGS s9_z`-Mnٵ\[S 92odwx& L '{ Cl8z 47}p /ߣ tb(pq .ceLAԃEA!opi!+$5_ f; Ij9aL)IkaL~#Z [Bc `) %Ha'[A^mӪ$-),DjocX3=6i93ΰR(JoTAV" p *n)Ub3,4pcUվR^Η bb<&ʣTwx$LI砛 vN'^Q,W-$1^QNzmt+ Ue\Y3 ʚcZ ~a+.W ,WRn \YЕV,TY+)ﬕJkE 2??(ךvㅜ!:`^vp>`NN+IYc3kqxsOdtՆRoaN2GOPm&cxj5a{<;{OY g AZ#m:> 6]_/.B+ lr}yzeMQ !^qE2XEA[&0u;)u%f3f1X7aܥ6#RlLFWb-8'J5f2^hkAޮ"8RxgWH\^qecDH)ꐻv aT&' [FΦc:mE Kt^j{^ٳ[Cic Z?}YUjN2x~:ag`ԀEƯr꙼jS;0qSJìX{ri'1+z͖yz[?+v5=a"Oϕʛ/'wmn&aDuKyn߁EZ}("%x1iW UE.N{ǚ׋2󞑷!!,&Ck.>釼 bӧĂ3'Cf8\\аa*v:Cآ7;OFSym;)_= nz]"8 1 9FcLxMIzO^:\{lZ.wBSncĐ pW9;W9ij⤇iB5w|?zpfn&v嵬9fK%G"My8s2Q-Ȁ;y`|M͒DSDyZXW+:Z#mήpp7'8dq;  E ed'tS%kE>Ϟ]$a>2l1,4'Z%ACc*wA -L+.% h, Wz[ZeGs$z!u]B?{W6P/;,}C4Ѓnl/3(t#uR)%L,Òˌ82oWks]4[8*EutQR8|CcUyî!)^a[ !*%{3Uh=3B%#V!UBbBk VZQX[0Қ n3,5e,u.m6;7+eAμsl`ȲkH3K=]t0.8?~Mw2,}tƒ)QK, (te8&8\gd/"oA2},m$zQ| 99̅Go#!%dԩlf?r3ĩЫ'(a9Fm@)I&>v´{)>'?Tng"ZJ(O & cY;*sІ˘4y7Gn^7%#fJy(Qgle9j2gϡeN֣v 4/YS8L bvmGHI*$G <`ПN##V VTfɉwh0IŊ I, ")G"8(ƾx!H64I:GsURv.#Gwoyq*qsMfe*3QN%變R荾-sډUR[49Q[v Tkhc!Hsm~z,(л/9z:71hQT88COZ80-1 .K2j )E,BO9$q)Jufی|{%0ڡ@BAKHm$fm< iYiGl|vt}3%Ay\qD"ԃX@}ٶ6m%ҔpJ%q m˞Y?9Eq峹JT%QLbW xzikZtGect R~w.XѲ!Vc\[03+l#;}ڧVVARr<᪁ѧtv֭޶+n;8ӫ Kd=?F1~ͅ$ԡMk'7*2V-:HbF>{jυDBt91'[wA9G8;qݐTbRiBQHz-DTcX G ag+w ~-ě졙׳0,`*-ăR]yǰ4 ;ĸƚ"6 R&NBL )Q4CFfc`J0c`v[S2DRAܦ~40pbx] `W: s}aT'Oc)`";8Wȟʼn4J܉?o[r'א]m0Vy)pJ !M;:PI;>>{2Oj-1]ɼڝ&_}-5ޛVjx+LmDDŽGy,$IAn8J'd`ҜR,)ձ|P_{ޔݝ0P@} &~gzS#excWF!")^2\n 
%r"s@Xw;Tbt X8Ů7~+AN`{iEIS egHjV4~Oo/'fI"Ws۫ ܋pD=F $^P$Ob_UV~U^vOnW*8"0?q&m0~D)I#)M 8ƒ*|$Xee2w2"a1NxDDI!Bs $Ǒ10DRJlo{jWd!D"3 (y}&*4D$ 4H`U‰pSDk,@m@R<(N0AJ nZZq90uxXD^\xF8"-pZ)&k E0UK,`RAM4,r!&!„@GIa$f`@(&ps-11@&Ejj#Wp /Ug( lQbI0`+IXv)VRTQذdzs\V]vw~Y5o(30~[ۃiከ0s(nmwd.PBȇW/C>ǰG"Cۼ_O_F3K~`{wbooK7Ki2gɖ8׃bk{Q`͏±`P4[Nw2dt┬Y)5/YÐ3X")T"UŪxmSͻA/Et`T" hhFZ*+yrn~W4%mudIPPzFRN_XU. .gj[-NVK*$&}ɧfwd7iO+=UENT`:KsBY\ST\Sfp`N p6tM4! cJP` $#" w#8Y %ɐ0%qLLBU(*FX $3TE0$NIAR TO QS 4Hq8Pb}G6shHs'4{U}ǃH)`'Q;N潈/6Vrz⒄'Z# DK(1W9x*I&(")b zCE3˨% X]lr Dg03_f/ 2Y^2K%Ae1,3B&=1벗k$塉 Z Gzu+ZHRt39MY`bc"4 BHB d-Ԗ ]ts5=*JBL#BzKaE cQjJD)PB4Z4Nh4e1.B;w^7}.le1) E;e~WLh)&@ (TOMOcZ4DaAeI LkFLiu김T2?!*2"b*[1TR&qEotDh\eZjFTD*LXSaRTA+p zR .]LcJ,jW,3R@eI`yb$dPdr >'0dXks@Uڗ4//#@!okb ~,%Û_ "0LzϏg3vV x>cw3z8^=2{uz@w$2ݞ`O" X&8ͥ9ehr%`$^s,<#bw_K=lX6Χh'5Я7t=ܾژig&=C2gwFQS޵{ G5s~`ДDI*dy2ztdLn/`q#g6J|Yz31^ݛּyf:h"xB<"M/}/9 Noe4AaI 5Qqpx(rLeX-yZ5zF",kGϺgP4UMh/h_r=¨oP̗bg]oZu2G `0uz?^7@\0v Mctp| k97] ,/%K܏2K-Br X=%]hn= p[KAmL@.I ok- YYA,+\w bP #kU) Z kFR/5UaUث Vm;zRaOV_+V^bd0Lf!Ч}jWs}Ĩd0 K<'38i8;Ԗ'3qEXTo*K++GXji=\d9BCզG۶X%HX%Hj\ mc)6=Hֈ-Fa{l5سe'_eոl >k:J-Qsz(\|wuRuV+u#F wB-[lc$g؞ْHg-M&a CM %lo:1bi['Z6-X)c@-[ Pc;.pC ?!dCБM[i2Q3O0옹\gp{M;_9Gaf]CZ}A1A30Ѽ᠃Ж׊FANsF=s$Z 4-c]̏qa@ (M%=M T}-CG`gv;u{(^do3Ttep e\8QhӁMb"pAnm݆yvånwip( J텾Dd6޳V6Vm (앚Sv Aa*qÛRU4m]]e 9s3 B-Iۗ(#y;y5HѹLEcg8^X vIN Y9u%LϩLNx]E#>&}!1-UL_ [,SS +(e蘈"K*H+Ejy õt  RўDvՎp 0+JȰ+יtt#9N&ige1%o{ړ݁R}N⸶=vӵܕnf!l8+a` dMaQW?tx{l3-{c\h_L}L߯8泧,6í7X_<_BE%99;R݊cj.!|r֜\]nM<W 9t07kNk?53^+ZvS6qBB-Uov CdgĩW[5%7"W ռ8e@5'hJ++rjg `{r4CQ".g @"eTpp/ȦEW2"imx~zkm@ɬ,-Z~~աͰd;.gژiz#Q@ _Gy_z}L?_ g!y4VC'3Ztq5R7u:E2O%kE7֎- @ LͿlo7' 8"z >s=X-8z9iGb$ :ޔDm i`ްU3";uļ5k@ޝSs;bk>Р@Z<%s26]PVA?a)%PqK1ymfɼHl6Y5hi>^sFdVdVuFwt^x]cy{󹝉SBHoyc*hs1,}?t-CL$ r|9c3IURrf),F I19d3kwD]TQ4V'oc}(QqE٦oF$1a$y4$ f4 ΨAzңvhD(iJHBĸL2IJ .c!&x&0R 4%2IcH#rSS޵{H[![yK06y,~)8ONjGU{T|wAg|?KսPr`V* N90v t:cۼ!ˁkml\,Xo菛`|mS՗gD/ˆJ.W$fCE1Mg6,)v͵-0 V )4$쏺V ׼ϐKęL˕ޜG9 `d9Cx籽ƖpxRӌ71Ƽ>pc{ͺλųumy/ܽ{C$!"4[>6Z[NbܠQOo]S~qsA(C{K-SZgl_Cv܁j[Oa;u 
`_nsmi͖ۗd irHr6Lgiq,0"S&dӘs )h <̓6,AQ6ՠgB`L \[N]վ{ʭ@4c,0e ` g58 !%H fi"0(spkix)^u;I k}8l<ٲj SKYl`޹tB+BA8P+o()@_=""N?JV77|VN_ 7Ueul 8ٍ mmy/O)3r?zJϪb0}ҫjz`ރ{tᇵ3Al)% Qw#kUef2 0 *c]-r0XÕuA:v{(n>l@݄A60ʱD$5dDH1je4jA bh(3! E9^WYkzaBYd,cJ:mvlTE]6C 1PFrJԇ]/mQRnobIELP{T=+NqU,/'^L2/[P4}ˊ4 CCmU߼;'"i7` dzI,[p 1ӚGIQpyL&_uJ-&F{%r4'c*^ѶխS 9JTq-{-Uo6} 7FE+6|wD#ҝRU4Y>zs7"O03+^nMŒ|g{)/G<1jc6B!zZ3@)I9UTecɧFo@j\@Slzt 93&eki9O^]Y5X 6(v.7S8s)VOѳTAX0Pͤ[&Q<װ ofZ%5]e*  .xj>ƞxAZKiEHYH`{9JsW7jVmsbxQҀQ124og8Ũ7gAfW?n>z9&Bn|(F}>,Lz@sfWa )h|ڥ'?\{_ t!e+3鷋ʀY¦itc%}ۙɯp۟ B~f3c.xQŎ+Wt5՝՜kEliL[[Bp- 6Bzg$&= \HV"d1QLN#˿?ȽL+W`TV \h5eH+E (ιő[drAU"\}MπD|Z\[\a$6UK-`7J:8 T8L>Po[CM1#IãϷs >XYYR(NՆ1 _Q)IȡXPMD YG]wVri `ܜjET3,8eԂ)BL! `hFa5f VlSdHZ9b=b1rkꅣ"Z^L3m)%RAFD>iP!5,AS61Bع !OfвCIC*`z]sS R$cg=8DdІk<Ǩ2ax>'~cү^G}'z?~DsƢX7ÏtrɍJoly21n}om5exw>KrB oțדϟQ >oj xu!"ZL/_8uX; SϾ}t)B1BV`9fAkntWa͡?`'0mJ ?@i^;:])GZsrύ-Y z6mZ&pDIEL!D2CXT8p9VNN@$^ 0 8`b`zlhE(fDV8:wSpܛts {$ua>ɰqazٲ 9V՟z^%w*W*ܠKTzEV҇b:Ft{cl7$# 1?JRO\;r(~o>b&F? RS)#nJd|VV&-wZ1uact/ љ2[>8|>(SLFɘ*p7FS+hNd(bEdVpUɽEN`"%X^!:HOi$Y$%x?),8t:<BEn#x[o^S\x*& qAN| S<|`lѴJeeKցe0dSJ.R1ʚFĐ4F*3Bh=_*șlPo Ģ2h93%x\3s"slEL@AꌪP[V{K#bQVy]|Vo;Kc5)$6PْDk!Z @.v O8UjJt'k'71*1b <(.#tP(VkV=r]j}ug5Y6JaOn W)v\Tu32XRhYbfKhx9ͲP, +Ԡ%֠{8L6W,Ri산k^oy{"&$&a; Zp8"d0IU1Ĵp*Fsi.:JQ^fο֝.( H%knz</MlpaMZO'@9z/s aF<[PH1m!"ՆX9 "_IH5~!@b,-N1o ނ *WPOenzmjԁׁ;u.wL2*$3ʽYǎ DhLP/9 75NփOP!D*FzRkk I[Hm0!FD`EK_K:KɊzEG\<:tbKZB\)O˛G5lϢBfj#?Bc,>St*rs!`V+pru&`\ҵ{Qxɵqൻm?YM ;]jF݀ԌjLOӺpug^'A)_"=v~I+DwKn}A,DEM>xYRXP܇VŢ;THb|e0.ƌـn@EcE%aI*4)HRGR됧1Udנ:f.c7qN `'Y:/ֳ)ߚt_[ӗ4-D مhJo049lȦ}^ :!F ۤ> l2?5 Ɨv) :1 3u+gER/v[wn-)k1elj[Lktk3&_ ^<*[SQ&zi &4 T]حkN)<2ʊY67WdJSR<}_25ZH$.-|SJi"V)0ƏRSM!6=殔?n5?=IZh&GR6cuaYomʑC |A6F^n CtZo C(M~*b{=i0zԹ%ہ¯|sx9W4[oK@`Vj9O.-|Ć Hӆ9q0U𧅲 j!q%_%X$i^:)nJe)iNO3R̫XG2nz29d+sÃC $7bA4TTx#1/Ux.y4@ibLV33˂DP9b6Ն`A֔`1!)FՐfNvQ~h1Og4q̗GQ6+iC. 
f [ǬbF.]fVzlgqwQt2hLJu( nNc"Olݲ'JH'.Y2%Dha=6\ja #-Ć{Z0cδ&jh)͝Cu[W ͪ,qSazا~: }\҉ t B1Rkɐi@1#qCc HĢ#{jVĘ3FbIDۋ!N-bd X='A;VMs;V3رX ϝ3J4?1Xji @yHU@'%`9mi9w4&(?)yǬ('xrh|=,* h4إ.8riRoTDvVC6W/Mi?myBno"dN skP.qPPJ{VbD~Fe?7Qs1Hyc3jC~wS*='K|F-!!fɔd2Gig$A |DPI;N4I|"%SSn OM>햋A1F!jk쉦jhLi5&*L9 \8k!rhSͥN0Mvt՘^QT 6P RTôN LR2pVBL2f,0Z[_ѵRvz6A/HZl#R u;gl⩈ԑþg2iM,*EÊY%Ӫ/WqqƘ>lpH*Yb[qo pYa/t=gR^!I`1c9oIl{#}F t6W,7mr=:M<&j<&yL/e]+(͙xR/c8 )sZpYv?p5 sГcp's)9'8lV;?i{К2X8);SKeg-)̉[FAiڜwGhhBD  ڐ^>m؜}frACe_gP(.?]B&$ηyfi'G4~[Bmڿ^SsW/+UQPhR KoU*kEMpo8M9L8RA/[&BcƵZzg\ ^trDY_޼UkS,_&ݲZܮo\ ׫@6ŵ6w 0o , ___~ X[^G/,>J- ][s^~a/ݵWkBƆ$dsO: 88p>}ŋ?^<\?(:NX Yׁ />//ׯ^O˫Nb1OSr4 $9v珮Q݄\7߯_h딳ꔤ5.QMiJuzi]4c"̞5% lFo6fĮ``v}X 7o A"Ѕi՝p^tC :#ԴPah FYJ iD*>B1g-kE-'@[r_KQj` Kd(`~߷qI8>Djߞ/^E|~\ԭ%fOf8zByog(441E%Ep5&761;RJ=Hh^ !^BZ a]EiQ141dwD%d*< 1f *:jS@]v|^sW2R:V! :A~j1f簚xa?s.*ekl}~կ=gu6^O R W58Z؎O)c@D>/voyxR#8_)Ev QY+p ñ1 dP-ap, ~RZeLY5]=EzL*1#tn9$y.=[@>#S!A>ȣkWx@KRp%˛ 0DtI>>cx+E~u [pAm<&z3b \Ih T.s';]A$-wXd { {$y; L03ӌo~>vqYsb}EoԜ{iRfi`'G;ڸ9R90RII͹yI.IO?+q? m O30qs3JQuG}{#9gYIAfvX_R 9["}ZfƳnطצ9g1,溇mcNbfǶa'P9d*FtгB5ϱB!mJ&̤? %=0Ɯ!GͨlnTq#Ö <ЪyF (W*84^]Uড়{Nmi9Wg=#o+H|ש!5Otr WsG (I}ǡPZTjSgsTD)ng~L~۶i4ISd7`@13QFT'Giœ3Qt#'1v2Z2n $hɻs'R`1!mR_ mc'<R BŒjļNvSWWNwj5\D҇rLHs utp~N$q{O8ԗSz܍3y"sgNk$V@:'XɧPb YxaX8;0ɧPb"\O^" 蠵#B͍9)CAɟ{ۅ&PPk^vNb S$2OL>ǜ“Ws.퀄oLNP;l '{~)=<|Rg'CÊ*2jnٔEC8$\~|epjn'@;9qzeX͚{t^y "kL:Lţӕ.:j*(s%8Pa sF#k,BxL 0q!#9*H?v65!I)|d:RƪyL6p>uH",Hh gEg'R BU:$umb^UQڢr qNèM&]՜ :a3ǝ  ~ɓuF evT#)x^Z=sǑfs x0/"LPa2P %fM9az}X b¤xMgXʛL,m&ڋvU($9ͧ?( p2Tv hKGuc{ANk~= r'~.|bHs—3w7we>}HK4[\1.In1 .1P2f&_LRz{p0Ij<cM#-U8śisc8)*H8HGpeor"֜WǸǞEP25b+F 2묵JNh;%|wjMksj J\RnMNAGuK^% ᧝d d=N4Fx{%f)dOďvPPR*z5τC'srLc&"M0ŃExk ǤE* dkHX79& @يo!-umcIVwo~us fV~1H~zٶ10e&5N}zْbAӚ+n%=.>E[7q2ΰ(cdע,rQ=E?'c,k[g6Nxlp²\Zu̺VT!(Ni^ֹܿ>Z$V,zn]xޔjy]^X7:n-^.&مu7&߬zuwlW[3jo>cKyЋع]Ga㿹eL%1ظ`^*Q0mFȗ/C- \{އ!(*cgm%m~Cvd[NdE:"r90߷>rߖFNܾ{)=(yS| RHȟ\DdX[4[ J.[ N{i+]|j'{iY}?ȷM? 
ScQ"/^ax|;9«{bfrjәSS""B(rέ<͔t 0| {ڷHcX@{g1f}pByD{"oc* ~wL1$O.!2EPuv~ByDtBGuig5LhSCB"$SNVgvkAP41ɝJ5P'ǐ?,Sj78r5}Qnβ;C<'F&Q?'oċsͱz9v7 ^Wz؍>"ϑ|O 1R7+x&o8 KpuW/@聽"K8,-:RL`oom06aJNl!>' Yj3mT:J&esmnŔn:̏ts1p3ϿW9>֌.b5^,KW |;nIo++'x_X/^Y hM<ȟI.Q{8a諳.uy.sER&S#xJ@Dp<3"4h.]WB*UQDZ/._z{k!3k@=0+&ww)'x ͚!r!4bM:*ra"*g2+aᲽ*F4OJPaҸ,Px$K 6nz}J|PJ?/^ ޗ oŇ( JP1tB"3.ja83ԎvZ7on[Da]Ba!%$#Q3I3'\9pS3p}soL8{x_)8KADf"ec S04BM^Piō#dK4k8(ʉTĢmt72yxQ}!YHJt>Q- ^O!o18Q)1L+m42cUơ֤oiPQ<9%LS_MR|f"=,4Jlo5@8OE1 d?Y{_fFI:AMI؆#O*TOUoē UU$Ȟ%W BV&^~놣-0hXԌCơ"ZuqB"- Q5K+^c2@].sʷ]U!4ڠk`~a|[;hs5`Ш` Ck>UCap] jWEX~Fd XɓAA,ȣPJ]fW:acWn)L>ƓbY '0^G%A5piޗ]R">cxh)IBͮS X"˦#Zj@'gנa,28oi5č}sQ)'l$\B|89HS0d7}NvU]S ¤G&A(qI5@'/U, 9=k"ңf*pb5\dNV7eh`fqwGMg%+S)XG\5= j`ׄD hQ&""HmjA2M <1)zִص JF\O=HhI<`XLqL3=O>{[!X9*u9h:-prDDwӓ+u1JHC 6/~тE(Tdi!μEU )bC5VF0)]nVIܵtEO2TF ]k:9bK%s"̌t Lt>ѹlPL f .;iD5҉ƴ܋ .( =ϑճ}J.=AĿ7z%|;o!Xl {3pn\ ~{^|\A=iкՏIz?ӑE?.%K+/WU^NfFcŕH,# -ljĈ+Cx2TjE4R-LT=5\/o}WA>gkCЁ.>|?0#+A,zzK1sPܸʠ`!sDΗ/ _ o/yU/39 tFfF1:gFE@+3BYjKlC-WdLSkazW( sR9%ʥg15UW}C>~d! 2ſ0Tv}_FK ][ yK{ȺU&ӎ41iMCJ0V3\&ƺHJf9"i&V?,eg`$)悧ԒH#rx(O۽XP71DNƉZJ_Oņ4bs2K!ZW,5Jo=у'π|6a||b6Z٨iD(/C&/ w)xI1%MYL>o\F٧tp^s׉Z>Λgx}f9=Ҍ,kN3y4NO[3Y{ \=rwΖ ՛S%=xsq>x7׃m  @"Aѱ=t)Dq[)J#&Iq9~$j:s婠nd<O>00)nW?A`>t9s+JQu8CB>&섪aC.yd'~ϵZGNʵ=@n3[nkm'8=3lY'֣9 KZsPN7*#[=KIu'/_]9r$"6˧io!r!̓.O0IۻGdxG6QyX$yh19*3*PfyDnj.N/`ֽ[$]ÃƻӭX?wE%,4X?֫oON`7Rf/ ˑmrG^ju@*24@ ⢤n0?\l0xa$<vf&YƜ4ZB0g Ts! y;On*{-k l0aeRDZEHt*`ZbdVl).`*}4J>#1yZg"׼z޼'Z"VCקSSSdL?S*\ WVGJdzw3ex V:6Ń^b Ouxb1"vx.ď&3Gg`o)+CwS6 y񙯍}u6eQIFenfҵln9^=oo zo&+Y ;b2dJ@͂ iٜ6F婣Tj_G]|q F/^I{ / ~.Q{8N+prO+b42A3(IhP8lVs2i> b/bE6e-e@r5~ z +W]b"T` *Ȍ֏[hO 0A5?`sU-m/?>O+{?1ƘM_K*Ks׶nrZ 9ZQB~ne=1VȃwW?ޖmv_Si=s轊-ѽQ̌ :x=rףvT0޳KuE!*(|%kAw-Mz)rikn1ZQʕ)޳u 4>>Z0gQRFZQ8{ F=WԭeyrQr-7Q?@9ҎR@A$bDi$ȱ䉌Zy8H8;b?{e?#4R@\2|𤕝vr.$۲S.rc-h :(Nƺ%xL Mb F s)X[hFs\"5cmG;Q:%g Pu4ИL$q5kɋ āHT JRQ#Gxh$|؀$cwE2(IFD&tm4R8O@)Z-@$K,xORbZVr% VAt-t? 
'Ї0BI70X`$glQ&o)!ڲd윴97Nxw9TnR`*F n0ZhT\v LBHH X" X&>FU[TxGz`L*E@=1 FaH8h&zq}!q%gݤ gFi0INy!<$)ꤣ娠ĝD2-;Ng+u^03O8sg,影>L%[g0gW t$ghk433V 锤H.15 $5&o$dM )b'sQ 5 N RY(3mSH8m_p7s> !uGV1T9Nn^HL) с ؜WXq\h8K ]RrJcpb7%* eVܩE'e8M ʻ^B%$1)p`Căs&i 98k(͗dA w?o sKC$ŨIcLACn$IxXH.Qxa"`1P(,ZP);&O!Ls(Ɓa$N m:m]BH 2PjUGHXpc +u 8fJ#krVg%'Vq$L*HF~v6],{j[{smE.b?i }"Mz,,?6~b +ږ%>%$6eiGn6h:cnfL<+ږE>l[ƅc`Rq1a^@LGcbE۲[]ȧ@-vJmuJ؇݆{*cW+ږ%>{c}Io41[ ڸNp/^"2-V-!/E|Jѿ̻nRQ*ڠ딎 vƴ-Vh[vC^z^/ofVhVS8^6ATRhkvC^Twwd45f Vj6S:>6i!ĔnŊe><䅻hO 2h VhVc6ةBLmփp-)џTb7(w+A }N#ﻕ*}<䅻hO)cFz{ݠyWڐVhVS8^6؉)/w+V5!/EK|}mvSf[ Zs)cXu@LUk փNȒ7?Uc=-pF% hb^!TG-Uhr,=rOSdι1.(%:vCKL q,ş脥4XieLB>hA)1)j):BKs|foY˯2rm旛|q}Y[~qj ԽzgU۱KYv>SzPH38|3{n>(ɷ%ܓ}_,h}Y 8..W1bg** V8%|erH*r3*\/Ƭ{B~'MZl!f5tjN>iMy|ZE='wUU`yE\5>;QYM]br^zE)Rπ{u_|d/4_cvm_@fn~4Nv_ٻ!olbsr#.0P_GRY,N4BTqbX:6:&bgS6Oؙ3JSjܹ2b5༼M^}!_ɓU?ge0mdl>;]7!u1f[[A7mX$3RMf0W<5e2n=]y<|y]`d,\g&MG4yYnʈyXA7w5Xk^:͓?, >_RaCYKLcpC0kao~G߱bPlB Se46@dv@Xkn@03PYrRTFaKX1.e&brlvh6<|x/ӄ146 rsp9 kk$i3%y l*Dƒ+~Fw8!‹ioGmNEz[H0V;`7*h= t@HV`0|xk_EA&J9&pȅg92}bD +=pAVz 1IFIkF2 {&\+ vzͼ 9qu *!SSQ=(bdKYY-B\)]:;Ik)A=WmF1,Ы'e`H1^jUT JČZH'c?Č쁴4akK1҇ ?_|i"å؁0#!-h\]ɠ 4N2/' Kop"2[#91b`uK|cHJ+؜23TTdũ3}8!3㴱02=,sFY/+Hq6Z냦0^Qc}xOhқ C֝$Wq>W|Q*hN/u?>ubNgOɆT ݸBcN8泵^=go?G-.KnG VU[【d֧p~Oldû71~! U\N՗*L !+c!UX9}%2yS˛o3*tlzf}w)>Pu h ,hsZ"RQ$p`0HHY3R u+搣%RJdaz#q_1"%R~M{ldz1$]_vUsN#:uDx(&+Bx bg2(@/V@ 2GPWc691Lq.QحP&N*nC p5c)xJvjЛeU;-FۀB\ )dCѲ@uSl|0x+cJ]E21#$#µ5aaq1e%[ !47&YeL?#p-H^b^ *A(DU(@>0V|>299(uЏ"WĥJI-, P@<$[EDZQ"jh1Z&cղXU }9WFP- w8B"lG/dm=iDcSPMȲ&1Xpa?j/@\qeUb %JS S6lcHzU)XZ'2ꇪSݮڡIIJ#%N a0*jBFff-ދ,ědb& 0jX8Ql oN(ْ%D`jfՈTQy dEݧE]GC'ݨU,QBH %l$qY $ODa(X ُ\tm<8ݏY.鏟/CLsLYۧ0b _KSo߽&[_Tq??ww) L?j$ kph~oOy^?C.oei[@Bo|<pb$JÃpu[LMB:E[,gފ.nyMlD,&he,Ã[];^VT?rܩMOD-]-i}onl^~np[ p&hj~b>S~s>ǖ  e I k@m|v` ?[ߨ@┡3]O?7jċt:bPʁ s: ^,N/_}i<߾y3(t8؉M7] yz2 /M㞦vL0 zclb^vպY8{rqR|8[v4T3v{_Ǟzr͇Wۃr;ȓ#eCXi+%Kq9OI+땨SÞcRm]kEv!7QDv٭eevfVEkŗL2л۩8AG2aP&>o9ޏf1DzWU=Y?]z?Zk? 
ם_ړ~hɏh½ݯ1nx,86#b8՞ov'(R+ x+2z?jӺq7Tc.oXظ^,Rq oꀵCs뛫zymxD\^^~JJ95.t8WI~O7gY9hξ G }7Stdc(oz^/O ^]el2mۺ&LݣYi{Yݼl r*ȇs;fzv#3R(gkB Li{V'ifL8vߕl2 m^ޑҝD$ 9$@~ T &*P KS@u_hyfÕ_mC/0j?(H'WTbޝ^1ƚsU kzN(˷k,c + *5ĈX )Pʹ}Y@s)kgZd/tݗ.F%ՒImMEɄƓVJUqȄ^;3J ؈QMȲ`c:UZV@WզP!8(T8m-gWjI @.c0Z6j:7X .Ja+pUF?k۳Bޝub8ՎBխBޔJy~>/!Cuw+I 0`ݒY4Ygl ywK={ -81TY/ۨ5Ggb ƔZ3ܢ~{E60 B̕QhXҍ{h4S'om^]'OPka.`[Ll{S(o^w}|# W7]m۩rܧڄCOZ c\By,mqNoP^OiYO[,7\t.7{zR\NZ h[^# mO)O~Hd`5jD*F` u7~:w:5T{'~ )a݇=y`\eezcm^KC>N%ONN_'ּEGKEc):CقMu,:p”{oIk7C6]3v>mށoOOEց1r|,G9̱<cn",]ܴv{v[B^8D0e:bLqquܻm=?UǽqǷ i7wZ^jyljydAvPٽ4Ј) ~if\m*P8+3?MO>Ƌ:;*&}Zi>-yˤmKEٸTk+a/s'vǟj^=P@ߊ;;kCMβȌ`MDK, V8y( +8_\Z.g_vtq<+Qvۃ_O?}ӛ-({8]nG>JpK<үN(GuuF`=^䆈'o]pocȫd\%+q*à 쪺yΔH s'N~y}~[hRժk:Ž QلlLS GS;5϶8f)Tf`ƒ Y*lRx6fcF).1C A9e-2\ш.`.=! QqvSzJae:Θ*^3ƘrI %{mRdCbż %&(Xvն7Q⸚=PI'a@}d]mrVxֺXuڰIRD2VW  .R9dbQFTu`Em# XDs''%!B$(eIbyW@bjfF]"FPfrlܺD8XAqHBd}4'o=#ek7)u>*!,C'˾ Zs-N~ub@\jp\c5}LHNט+ΔuJ)Gkj5ۭ}sU^֫[Dݨ&Y'TB3[}{N-Jdj!EsDPԓ >E"Zc(}( V2OQ}H8*(}(M禺T7JT(QJV&Z*jydd \|bU !A'I+o]V NU EY-T-k%dhۣ-l} ަH"SAy- @V TbUmBZHB bSa-CcaTmY1hN,FRۊ%tĬȷJӬ e EBIGP@hݫJIɆg0QiSB1': BEֵS44&٤̭u 8(;Օ5Yg0CU>Bbd5mMGhvlKZlP6G":L:tE4V8sྑɲJ Y+^s6dsUDU#J}ʎZaT{O)Yԃ(!)aS,EtLU}^ň@PmުNs}M{ =M"՚AB2ovC);{ҧ>Y W>GvۣkMoǏ7f5wG,V^K1wsus<ՓHۅHOaH,[(s;Z&ﭯUD~CMa:ӋjVNb gЧWc.)?#>PjTBDU+e/ۈ3ؽe'A+'Ds3KUjߺ]\֊vmel +v?ݮ_ ;v3V\D76kǠt&덆v6Ul|%/vc, '>8SQ~K kt dQ+}PG6 [ONX~0>Wڮ,:g K*5c;M6;JI;lmPU =K$=F>c`~N}9ٻm$v[ nݪ˽l\[[u[]+7NKw03!H<$hhI߽So{]>ܿw#|{$'혙W*c2KpPr:TO6WmQIqMkX۴(M % hrÞm&vRՏ+hTx(,1:(tJT~!S8tMXSFX I)D MH3;㜊r(7iaI(^ۏQ^ E%jɰ^W,{$ҐҁD7T$,iDQz=tӤAi7هE_)ܻ_[ݟ.WX~: =D:>35/2>iJj:d~pHv%O0kSI~l ZO(Jt| t2GQ"ku[1xjz{&͋_ݵTOvFZ rjQ7_ґ|jdZquV y>$1ϩb^ADt3T[8OY_ݒb>C^9Ek8œߊ3)]|8^7yk,B9~3Jf9IV;O9;lgNc0St8??8jM  H"}|w!]Dݛwo(IL!w2+#5wRZ(Qso?̚]Hկq~OItz1Xe9֙}7VEM#Y6O[v긒(7Kcn `y,][.=ϛ X b$Ry,][^Xz,ER\o EeKV\Xz,LJ Kvc@Ƅв;N)+c]YmPYע&&*FCZz'%vn h%5Ђk֩8:] Mc)X]kVx5ѸֳĮA naa/u靣)Cwܝ%OfB9r<GXiuG&1ZuoX%hkM>Ƹ& u_=U+e΍h=!v: >喣 D9J;g{Y.:vH3(`зD_V'H4+6z~Mc~F%@^Qi/Q((E8DN6eK+Y"F爱; ReۦiNٮ|Jb%Xmkjrvbo?31UNrBs ܒ6(IՆ*TkZ{i ; ddBЯ9=VwrCJJB4/= 
iUY6캆cIAhlS AL#[4TP]jD0ʕwaeHuR P#F綾q}R # lIߑ#8xcvZ ,CTjP=,ndk1)dE+^JotY*NlV!yRw? w?,R"OJ}Fw;^ť*V FMɮ;z,U[!OQ4YP:oD'ɸߧ??4^6o7Esaf#k~}}>|'*k`4Cq)7?\ oL, dJYT+0__d6*$OӚ7d&v|Ș3O' >ͫ~P=zt\=3VBPvr, ZFp}2-Z3>^/@x"5z2Yv9^i}ʸ7MmU|.H/[d tz)zr/6c"JZ93gb .73i|?8*EJ͵hO~ՙu@;kUV'k\[|_mAot/%O.64 <]U;jF%fZ`,0ZFl|c%Kh[%ofi [Ah3J$p[k%ZL~-);C^9E+87,1bCgD]ɧ |fF1)Z)#h+FVAujQ"bo'n'2 y)DT@#k}|J+<QȜ[~cO\Vq?܇pcft_)p.Z)}yտ=jMym7oԻ7?J Yo6{w{u?]=KVޅo){n>mȦ_]|93Etc_`/pMx]M"Q^"#7'u SBY Nm 9p:{ PqGЯnu8A> շ n pfdcr{j*]hhx4c!Qjr<: m ƩK)CmZbȼV UFS^Q_bwnKa:8B+Ah-^}l=b\rStʐA컅-OH/v *&tʈCВxM^zO'ÇY^c.Zq2:]C iۺkҁ`5&\yN={GshrDnEg^% d9x/tuJYyUb!^ZW vV٥3r^TRcRZ%EOxj6YW i_,0!5Fi9!7p>5bVkm)+E'P-kD8Mm5A6MiEx$(Hwf][ӡ:H{@tG 9`l[t8Y tDzY f RICi]I߶S޾n{|)>6twS;?70nLi^c9 Qh}ct\X`J1> 9dhV,RйBF2?d"ʄV#_1. Oq2o ]q̠H7}eA vP9qG+VH:ܓ$eMfO粞3~JRE,EVOt? bmvF SVK/~Lrnzň7#56)ip'[+(//@\OT17:Ï^ lĄ*#*i6xv9Ʃ [ԗC23ٓq# d'888q@S"qP=*qP:9A9wﻘڡ?Ϻ!B 23QiVq'W~Ǽ^3(yklt{L\גIQz;9Oxv'.vSY9wn! ۿ,yKp^CӖ\|S냺B;DTEj-RR߅4_T* ƞ2WyL҉ u>I:$ye@ sGNPZWX^%Y-{,F{0tR3.r+uɅBBTevV,i*K1Nr8峥*7WOf!UY8Љq嬆sJ|h\@hlTVxp!vA`CXu-şbEp4T3Wf22@i]`XG, !ā*l՞Z.-+!\tSږ |}ۇ[ {q>3jsI\FyݶV+Z r0jQ"2OSN~)2C^9Ek8%%,~&ُn[)FUAO3Ĩ[DQWN*NXs?^[-9ԩ}FIt3Pvµ[}߃6C^9Esg'Bm;۰)qa>WM}yヾ )T}"޼{S³%σ0Iz@f?8(|^\RǶѱo? 
VB~@^l0Nꠥ"ZDfE ܶ8iH]#D!HImKZ,7jwjb9%*%d͹A 8( i>5#LĔ]I zF5'A:4oyV5a',KzTO)@]K#-L^h#tiѼSk5H;Tw@z0Mvi3 34Tƫ- ǜfm-lœs*hcN)}is^ r"|YC̴:du(j >@jM,yͻ1RsI׽zz(&sZF@q{[2-J-"}8ycmT9y`1e>&)){?SZ7Jzq':T> mu2o EnM0S4Sn=zEpRt!P^ۨUԩig}3>4Nڜ"b$]|vJnW[<`PӻBg%%Su*t6ZxVK%j%i+ l |?O0auNjhP@(lQ650=0"=_.ͣ|~ByЄF\+$E exC!Q$&Z(f214Il 3b_͂gLָٖm Yk_3 cy={cH;6 u*=Ͻ] lhj1y鬸x]ZY6>׏4Cu@s Sȑ$ˠGjMqhoSC'rVMDGiJtLAT"I 4X CVfX`FԏNFٕ钥niK?biY H?XzHiQj Xz,3>sϤY=Y'PJ㜯/.*弻y^~urǯemȩ}7z`엥>7L1i8@upXynf:%'7(Z,?dW:.$ &!X+æ-Z:WtC@az't 5 PxT.$ܩ֕jvpgh)H@idpҹjrW]|_]T0%{PـuGtE 7C {GqXdBݳGc6ː+kb%R"H&o7%=Nnf2Xp+*ņĐ2XJ G\,Wr;K2 ,K^f:mp6Q1= & /Jxp1ѓn ׏GK0rcx$< H /ҧR,YvߟjSp qƀ10 ApBsGt2@ Ⱥ^{ftb@y3|hUᓄj4%*aUɋ5y:H'1JĄ XBS\~ Ѫ`B}KwO8 :$wJbr8M!,ۻR.ؼ'j{D'8JQ;jjy@ =A1i6K֑J̴4chţ]6 O5 fE.2j/7jT̥kԚ=^עs!XkHA VJ%VJTA bDT`(0B$zZWO Md~{iL~0[N+y+ZMTH*ARƘMC_9->9AlDML *_9- N(Pnո'.+$T#ƳTjǫxalfqcKgz1FFTZ*_l;z`ԃ Y@0nrԶ13K76n0?X;4kp5<1]lc}wӛ~WJ!ݡ)U;o?F(?( lnɨ1fV,忾ؽ$3ݢ1±2#' `%mlv~*imOf IjEf\O!?_/6[><1gp`caih . tvt?܎ R8{n9?G߯^-jSS?^YW1Y̮^8L~06?>ʒʒʒ%j~}7B(a0UTiaHZI` NXC+-6-"7}s(n3 ogn m|M)pl*.Ngv 8$tb;dc}BV Y!Oݢ 6HKX)DP0{XXP+qB"+01I3 !#U 8χ^Q`1,9OK=-J r#d)%jl?O_T݉*mNuHeK%GSy+=T1r+4H_빇SZ~骖åܷ?T}*WDN@Zk‚"֑/!DZiQ 86X)Ov`˟l3&m'_..WP+[7V2\F/-Hwdz"X9ATLic9&-P딈k1 ܆,Efq><jiijd9'NeKŖ-ۺ96l-C{Ws qCv5Bmk e%eSӿOrո!&8E`Ŀ9Eօ?f'g'}z_k{_}۪jkwZ$e@yjqh4I*yֲyf8&zaH{l| dzb$䇋A66!sN/_hd1p6Kջ6A}LEp MB>P 'tc 1S4S޵n©'ءu ՁuBۨb:Mٚu Osiݚ` hǹcݰi-T> murY֭ p8%<9%&AoyC&y.ynvyӢ:y.[xQ?"j8l>ɘ1Y=',,K-=$(56ғf1^,%r0ä&K Q4U8ba)Mubq1#\Im|6yQF^)HM=3Q` KcӢԈ 4[3}.SxrdzZ3Sd)d~,,ωzCRORcғf)~,ETz \IM$KO~gd~^Di|B!E5d@[)wP(8A@flҽϧhzɄ'b^f~Gn_! 
?93dX|̲ocuףWCDrB aYX~Oއ&Rs3˝ɉ4BltO]cg,)x:3Xߣky(xqAcm _3Q FH {Y*YWe#g͖w{:ZL[ъ]_ڟO { kLP?3s{X?,_bٺ GF[j gboGpQ*v<cKt(k4(aTi7V-Qfѭ- f|2 ]hc$@JԾM9{k=wtk}hfn>9¹nr(|Y6'ۉyTRtfI=dHNO~V%[*8V[_jv][C2j& >.U_j&7!ӂzޡ:)FzM S&[xn!/Aws#uhBuC6X.)cYÀ֭ pqJܱn[G:n:NhU[!Y%֭ p8E<9E&"{*KSo2K73Y<܏wF"YG H/`p_HԚx /y%dgKqUbx$&Bg/f7~?!j'beMQ Ӆ?f{׷+~.Fw 3R`Sgo=_wrg']C'#;o0joޓ%4!1 BtN#|)^s"|ۄH, dy?,˺{ܼsY'MB4@Y͖kOΏP/7KfBsm?+[߅}3:Gt~5c$Qb󍍦X[Gb$Dh쇍ID.(ހlJVQՀo0b,ϭJM&K[dRLˆN#%!GO +x>֫U[Q[K;kzŀzo:,h`P jHPқ`!1$;!QlHl .4V5 _o}89 2TbhG Y*2QNA&p('X=ѺBQH{I<\yAf| 1BWᾤ{ , -x`Q{QmU (:=e C֑ȽxiIQH\KT =Xwx1!%q &(Qrvpj #Jc-媴 uUI2QLYe% aA Y Jn;KuQ눭ŸnQ\cgfq.\TJzh3eY(/7,DQF礔 [WΓRPX5/RiS߄yʁZdLsr2*E^U/F4s)o`i÷FDymH)CmDL@ U(%w#@ڢ B)DEB!/,jc݀`s89kxqAcz7K%[l2} YІ]+aCT-L n_2\|?\ djrt aTp.$}vqp*\"j|L|v̽3.gVg"g\rad:{{OBȵ;ڻY ){R:`xPުWN ]Qdǻfgz`+_(nΞ\?>>ͫq2뭽?}kgx^߷,>\;@O.eMOOOO!:/p^4uN)~ 7O-!O T_-ݤv3;r፛4hɐpLBnx흦o6qbr|ͨ{ӻ`.\& 娇ѻOEK;׻?z{Gqmݧ!j8=j؅y"~RHFC+H\%(;t?tԩC$oCDDnz;o-(>Hs!c[5sc4@&Jmc n:dusí&\ b.\>=o sss.HBUՕ,KDs"VVV**QZ6s Z`JV5q" ԱnSRz\6_) ^ MW=#4Sp㫒!VKri且2r3R-Tܰfv+(MkڹGٸzEaD/RK$I%At;hɀzth \o\%@t=bt,0Eӭ]-$*{|wgl2$KkrO-_CZtښیFpޟhQ?EFPQ=%D޴'vxЎ ?;hE4, Q7bdhX(v"QX-y^'W)P!>Z4K'>.t`GםO)R]Xt[=(H!$a'ȩ>4A2Cn5q %n|@x ^Ɗƹ0[-58xf iThꃘMߑ>1\TujO9!R} s6┒ע KB ݚ :Mߑ}I<-2wN&R\~ƹ݈9F>NwD!WĹ-5:`; 0S8E4LK9,6*%#s~T?ՙzu?wogCX,0^<)3o'KUR`Sg%KO>o\k癃ق2fOq= =e4}8v-܍.M 9$P3Vx&dU6AVU$deײ6~ǧ<9d&B:?`៫@u"|*Sl83}M]hC}t{=m~K]m՚[P;(GybjC܄5H$ d_"(AA9` Ʊ`%E JX?PlB,Xr(l$ ;u<9 @\d'VE!P,K(;ȵ6*GPEWud|"d;^Y1sy~eDֆhqtx!ٮuI޲miZ*6xcHv껎WJX!5ڐ2&I;ֆl:s+O]xp:yղ;/}}gfB()o#h,r!d傑E[h+ʠ.+A%U%sKdFW9?7TJˡFo i.0-i$"}(0 Juo2N"wh0JnEIn6r/E3"20ӛvnt9EٯaY5, &WR t\ӰqACeG\1riqFrgv̽KI )eD.}/} ;ҁ\A~87XKBQ"-\Ɂq'b~ $ݵ];c4gº E";njX6Xy];w$'cXvk44ݼ39HAsdq&wOzӝ/^P/m#..PlA3EJ̲<>ȉo.pPK[(3-h/.ut9L-.e,ss7}4J`u4ڍ$JlciTley\,)#ipYwn0̧_sg.8,w?x^/^Xn7}ZCGv.ng {|7`~LdψpK???@x7Jfp3m;i. 
Na[\>ķI_%mXx5[_>AIgGc+VjQF}=ĶGjG,eժ:Y#WOzzFRzZX Dyq4nPdHƭVE>Xz,c) aFR8z)8Y&h|,E#F]V_Z ܜXz,[=:$>$KVO AO,=>S Ksja^UY*dKE}"8Kcj'7KX /@@q,[M O,=j_'^4o ,#~m4/=f%su,1w[}j/vbQSKyZfj,j /Xz,JG|> c)uit,C`,xN#gq,9>qR)XZ[ ̜2RISҚ(tI~!CU~f//(inaaIj">} Sk܄uH ~.fEPtpGP4bmh`y hJx}PZh@S-D$n@ ]Z l>nXqK ^.sm)@}IL RL*1-I@Л6*d⽥?;;dU4xa(p t2Qm9;Rp -<\۪VTzQx:u?N!Ic_/Ȓ둥Ao[pRmi{X '_kn O;ݟه/gwA_5h0niVŃIҊeZ؟'nq֠9nU{AzOAڞ2v4$ XinXpE!Rzcx1~:@xjT;^?nUJǼgy tO?c iwK%`.!]MKn KD>YgZyF~ qd~gZ#^*h88f/d%*w=c߯n/%u1n[*z|n89)}_={xT דȲgoZߕblB;]vŶ`v}_UG2}c7BwM*!^ ލT1 MjZk g;H>9g%<=qeSm}ضxcv#oFw#_<=8wލ)GoUtz!֭zCd0X"!#}^=MMb/ ?l0z8U?ݭ\Hw%I v|^UXbRC/PS)pOo?^|}ǝqEZI0{%ɵo/߇Dhd@my/b鰗~,\ߌoo>W͸|pTLk5bJ2?v/±cƍYV:c}^2ONnOgSw4r؛b#''#wrʓaːy͍Hdk0 <BjDHL{$+`"14)<ԵT_K5@ ү:!'} `;f41EYxF ,2Ƞ>#7W,jĔ!+S23 oU|_8ZJR<ά49C)x|B\||xVsLk}WW~ fחa|\@6nʧٌY] 8bף/\DH2{%;ͲWK5 4:))51QsB{uE MFr0[$F+@' o_wKYn#xTrn JbcQd)z'Ui>?lj`H n5(|"-UMO9!)nb]nAԷ~6_o@}}?{<0 F3!pEF( 9 8MF0lMUTx/O"R 6QLԋ_(yBlB>ZBt+kj#E#DYrr{mp_~&ԏhiK iMi}j5VӉ\#ﲄp+ĕi^i$Z5aa|×s9FȥS9ahM/mUހ mj O<&26\R{T==8wQRjN@dsn{úr0kp_ %FazKo ̈Z/ā88A }w]5P㼻ւ͓B{j@:$.0_bYն7djF{MvjGursk5SmOU`WXSY|RX:9} ۛ|QN%X)d+zU]>bJl1lGuRX@w!ehi _>6ʂ6jO{6o+7eI2 ~'nvK FtRhgOdiHօqM)mnY<햊A褎D[zHօqݐ4,& 2qV47I 9("bbj)GľXRRTivh{(.EZ#Y ٢maN0!52r8u&:&ۉBfFYNR%$+y #P.pMx$ռڒĵ~[H2/Cc$!-Z[d.p[IҖpLs'45= G;4 J0֔.| ߪb9J(A TV-}1$`|@P@S qP$5%0k<*SxtZyNr<ӮP 22*<7QZ7!rLJYO>l(uFW 7VYr6zJJ$Vy_f =$ ?|oNO0 ?CctRbFfC7S֗w VuyȠj=p=s3X~盟5PP`7f ߓȚ% S㋵ŽxG~3"-i]o/k}7g;)نܟD3 cnZ gQ1HxW_hMaKj{סXJkQIL9))fet=3ڬRĕ5Wć^+I<])]L:׷չ"Aʊ)դiXΡGíz Y%e} r X#(Pl+.lcFfX"gf,Ke |gl}(2c5_"f5 ZgE1hGf :&5{s_.hh Qkoݥg$=q}+〭i|${C3.XJ{nW'ĕM]y?yiFiy9GbLtTN nJV y <o(EƵGGK\)yrdP B"QCdGzTy{F+R<+@%h1J13Z/ o ei=<)B^Ripo-~}[OW}8/[>u6לp"8O _fPCui 2u$zPž;'_(-|LX86 9%/=Bdghyܔ/`n L4W Ul<&3(pԡK Ю, 8aH+48nߔb1W)(*a\w,]}_UsW{oX{YIE[ڣ7K ux>&d\ vhP0P[S;!@)j/(t~06(Wjʦk#NHAՓ$-U+x[32r1$+O pa9WDcsj>"][~r'h/zIV2C63ù{/w8cūxQ\+E1jꅳHJuױQ}l0kLc}*Ñ8;QXZ@&׀eoF<`mo8Cz93xOk5O{r_} yaZ5==r#gsL[6JK"zUUҫOt".~VS05oLg—bܑhm⒅cMf5/@XT?l+jUkLF_=VK" FfyEU&^U*br7_yv̳`]n-_@k9RR9dYa ΜvܡɥX.sn 
S3!滊g쏳Ѣ~TFR:-*6jcOl~saN[KVL7}8ꀪL -+m1ƈ\ƇՐj'bcb_|5&d#Ul1G'4(xJ\ )EN@ N0,S ףځ|ɋ OղZ5%bEQфr[:SR  :3B.yI+Vɰ*i + n/ryn4IEsS"3hKk%F6֙CiXe>g2ӚЄ煳 s/Z?  Q2bTCYV]/4Ve(aC))SxL ;Py̼Yr2R8)-aqn,zR2e!w3kOXFziO;:Vrc0rwp, tAcdV%"4oۊ=|P؊$*L® (x9 rA{d7Tha|ȦTv'6e5_ IFRdi7,{98J#dq M"oKh߱܍ Z9qBw1n.чfy2:4xcRh A͠>$3 lҏ{/(A^j@5cV| B-q%s>j ru]EORm@.>M+j<|{o~pr#h'USm9~ErI5CT b:Ѥ6)W|&Od.$䍋hL:y}z[a6-Iv;eUl0^Y/Ru!!o\Dd[[ PcnĈNM!O`p[zHօqݐ))(e,ly /^J Nf> s3\}FGڨaPM6|~rkv{:Ч?]1y$B5 a2c*eA=!T,4z)EOΈ:WFr Ia(.[5 M*TDf 9H ?&Sh\qEׄJ_HbZcYk9XJ!QhTB|hN2 '<:)Nd@ڈ 4rk`e*ϝ@BS \i4܅HrK͉$B^rTB $J^|k!'|dac0qQPB PJPفvE~ļ\znTR7Z[kzQBb+XhFKC7qh% O,"`((Xfj & IB!XiC3++F64:CeFMR*ª%`($gZ ȹ2(yl0H%108cFR͕>XkU[\FU6J2)ID8T?85Z0KV?8ClS J D[z-y}!X'=IްF:'H&Ɛ@U?C7 =P$\=Ynr<#dg''8Uz͇6}FNHR. ‘z$mؠr|WrJtͶQ͌*-T-'R7ުTVNJ@X -$n F9aֺqŨ%a%D ј?r͡xrJ ٨sU%\Uo4{T7 9z)ApŎ ם**@cúwثPht:̰OM %lHNSyHM(jy š)9e`NLvq\ ZoKFOLb !晴rŔƒ@Il`wNQ/N걵u~jDn~~:p]jU/}"MhUSԋSxzOH@u*BG;Z:S))֢n[% tyBiBqnWvC&9l4R/G^aifg@ٞ$O80dGVٝ~^FmWcMKMWjy'9DUR,r32[}0rd?` ixds ۩7K?6fIj]}R=͎a!LD t"Ch.&yW䵰j̬ o},#,%<ʥRLJJ{:Jxh|df|΅tw$|E%sWrGKkO0٢rC:@]\ m M_<[QdyGv[qXσ?_Ƴ|/.?ӻ[}C~)⳵<D_$ KG m5{WNdˈgx,ƒ>/@q! 
q^K YQfW}y YU`\Eދrh( G$wZVi\D H)PU DTdm2DL0 sɉ1DvzTӜk4D!K >'(5v,jRR*-K jKzTԱYʔK,3sRRSpU:^1K-rKm;R,P Xԣv,nbRL24;u,FRolKzTnƿr/,Kթg5ԭZjIgK/b5;ԠұYoS̪mgHrM[Ɯ鎥WR,X#BX4RXJRTvRXJMXz,uoۥI=*JMqu,ʍ\e,`)Wn,ͤKdGEOzn:2ViW#5Ņ6KN1Qit,uHM.\7K9rc)GiLTGwR|RL rԣԒu1QWR]]z߆#RRYJm)%)K5oK-M&vR]]GZR]D/n:eHKY4_,̍ԒnWYJnҴ]{vzTn'YJKI)jK q Mb]r~P" Y"qi"A Ø ?$," bʓDJlfnYvY*Befynuj̆-r¶bHnݰgE^^>Vfgd1:M4L@h(`JA8`TGrwVl C3--(>YCMaKD8R]+{s Yw@`zO~^ O!]_P0ƗNBaOr8Yi^5J] rg~A1 (i[%2ee)1xtL\u{TS|l:[MM}WK0 ZLC[wԝMI2 t䝉0jE<`5%_J\I('@@7o_ta3L;\^bqͶO,l8yKo2>.FΘ|j_{j`&4i4A1Fޑ?ZLI$׫cfg|$kEjMVx EÆU)l۱0j#ř0:lUL´c`;X"-YoJ|H nbN{tҹT\o.aǎ!"QM N`ƸG']8r(2 ypbhucBI50oT"L`)=KCixb.d.{O 'mG8`N3!ZH3P¤!T"FQBC"0@J3@C *.r:*%o8Y 7whҺ0Ew+ꭠJrwDD 0jt̿lb =7`Aˢe"VC*+幐n5T<$ɰ/^S.|/]5\;nΜi1TQ]`EUV~|s$*p@J2Řs7݇vyti`y9+`|;,'O 5C/ p.x$7`:bFչ"l;{\1ΝM],U%d7뉔,9uǏw* n;RHQRK| r^̂\;T+ Bvԓ6PaTM$B(PK"2LLHI"P>ᇃ+@#̸az*>*$uq7' JBR@x( E2R2 ޅbITpfh@eBQq9,!~5܀]g[-UFdId:̑%-yپD V򥀍(|= SA$Nh%&%Ғqf< 0&C> `SǛ5bR׻ oEhtv9^Y%a,`+!"X+P t$q@ۈE^ZT*}ْǏ6[ۗ-M ֭{*[AWT-~~yjX**&|DF7?s+]ȋܤ?m|o1@xldeXxL`)t€!!`f/MIXcm? 
V ޓ穡nBߛUe;V<#L#<d,y7v˟@\jW<mM2 ("Tx0'8u3BԺLd6L' mS3hNKoOƛf\WsXz:6@d,8Ϝ v@#{YvRPf t5r%BG1eNAԪyYrljn𘿍4\m3ݟ>LZIMT{Nݓ޽]-[M7wtʻvO[7b5v󖍿d|8yJ](h~DRFg+hrjvGre<ϪnlJƗ}Žt?t@_MZZQQsY[{ƱMwhBL1㡎$F Ә3C6I"s@Z*6z&5xT^*[*P]BA_@ UaT(s׻*IJ>]j"חRL4p;tS3Xim0|~ן"F??|kYD}4]M, >y5u`# zd{;zc@>|x9vU&9QRWQsO0[Pq+*MM̆7VЂŊw}H>N,@60(s um-U'kCZlDHuZon6 7p.r<\6:8с1w2<~2-w)QZW(xBd[bh꺁sWCS$O,=cJֆhũcm)QNKaǁe6};߱J[~Kk:*4屻0w<40: P{ԩNv&zfc״cV_Xp)@:$J%-q[fZ.л3# #.}a#w&]ޑt|_wI _ϔ /_Rѥ^Mg᳐6&:_2?nնdRMz<6'DHL'FDbWeΨ2mlFPfEN< U<,1]]; g+v."G` 6]dL3VE]\̒Ɔ.ѵ}啕,V; 3]`,m5Jzx"h <cÓvӓGqEԑh7@դsQsRh=atN@ls'.<qO۲gqK/h^/{ػ| [@ ZT_z2YPk~m3a]oAϗL]>:'Ag"Lv;:lg7TJ}odf1q6;Yv+Lnc CS[1Kqd7pQ$odiy̷@TiykI!1ZM~ޱmgW̟dΈ4kYR+?cƚDcZk؎,Wa{*:NAɜn ^M/RƼm;R&c oP1LF7qvC=Br/7$4GH7aOHXKSqfҜdFx3PjBN60> (߶,%QqڕTV[U"vqmǻ}+oLy2Q4,EsC棳Iߥ%7BSԫJ9fm҈Eo `g'"ظl$3O[ O=TAƞ˂鉗֊Ei{n˽=ݒ ):V:r}N3A>؈mnNOsW3ؼSʈZ8aiy%kmŰ z/<>9bo~o~Jt&7HER$h:âێb<̣Xy9xJ+Em?>#b8͞elȊQ_"8-l6MX5SU hpj6KzHVB@ukAumMt 5=-1(%oȄ j'u#RHӊ2ƣib$-.AyVp FbDJ{C%1{ FN҇;P% CݱEͶAQ֠b  @:vu7&^'sPUjkkMen\m50Rp]4ry)qǬڶc/g 6vS9938ZMwy$áqZck0ԵbJPiigQ7ؑ,OQdł(UPlS4)(vW(nGg MPl3XB":66K:A82qV}FGݝwYrb.,ؓG7칸=gO0y}"H 2=RvŠ8KžNICݚBVźj+Z7Jm_[͡@:9g 8`(x⇔I(0&n8һ۬4N Hb\:q! 
{i8P@年)9h_j]nb;Aq2Vr D q Lim% ,pX+8Vr}wFN6XI E<:^r w>ݱDێ/@pVwL7JI5T)3%FszI:bC/]ʿgU8U鈕+\-M&>8õ#K mi#= fÓIeE)eвOy9Nvd'~3Fүˉ’P!+tQ<+oF,(pf>VޡӠXNXNZް|y+p9[r#dS|jP[iK&GOaFqR, {o>ގ ihOb/2DE)7p sն՚4.BIz}m|bګ:c(Uq-q!\Gs\K5sjsh7ˉZYH.IXm%TX 9ttձs 1!ZcYٍp٭))FNv0H$'n Mnc CS,˘!ge7gn>HN7rۄ1Ea;Yv+nhjv!1ɍS|:6 c9q3ETjjKuVǢzR7uV=PWVNsF)BJ\JB@1&‹Vy42"PzmrQʉn^ rb.\ 1JJY:nK%K[}eVrAY$ \Jtc5I672r)QJ7Vh{AYT4jۡԡ[JY}m5i6/(=GKC)o#[J٥tcuvA_A{.[Q @qҳFi*ͼT"PҍդrBz(Mb<סqBzmI JI&:.6ȥI&FcgRJ6 0%V[F)bJq3+E1 c纜7JPJR疑KY}m5Q* 6'CGFiZLܺ+v ‹(^|{GP56PUnT$Tj(Ma΍TA!ȉsk$DJ V7wޟWݟa~>IioBr1jW|9wcֿo\Ҡ>7ݿnW1R7 }o?zݥMSMy㻷'$B2 7qQHEB2qa^2jHKF UI\_Nb휩kH#WXb0^Ys_om.G:_v6V_׷%<ж +"o NmbiMm:yuRH|,h NGdz8kOonWgGZ<CTk]m Qvlږmʯ>N(*R"UU=Nے}-ygQJ}=/B8\u<D$}v=q}.:so}@lz^>נ{ T6;ۗ3#M'h[+K ]us) omno~uwˣ;?şwϿڴFݶ_6LKP,t#V?J65zG^59T Z*X&X\Aݽǰ]*q\x;%Ue]cL%*!pʣm[[^z:qx&gtft9ܘhGIvvH%ELql ºdRcJJ* <5Cp%]ۛ083/ێs%7 ظ~:"J\~:/G! ]ћUwue$~dbmQzJ`_~;z(! rLbú F0 ^L5f]yvJu|%^J1J>ys;|@8/R̟#$xJKbfAo; }yYe>wΩ[)-L"="W#ǫ4q&l\}o]^ ok. 9]ln:.-#uuS%xkOp`Pڅj[P/JVCd՟>뫯n]uEB"1mgZg!,g3^zkxH򲑯.v1btlܮX 5 W\EKadRD~bXlۻ:%Iḏd KyPEBxbדG2;EM:SIJ g4UJ: 2Ss@la! 
to0CL'b!ӫDPFOLDɞ6B $hPT{.=JtɅ*ȖB6}s ^ʙ)_, ^9 _'KMe1 f(C}E殠jPʔH,Lgi ’ K NRse%0G1oV$mfu66~Asygk qƉV)X(& ㄱ\CQ\Ɉb5#7%5cÌbCް'1GRʴ{MS.Ǣ;v l;9y.쐷m]Vb؞s%܂6tk[vB'zEw@a$rQ{O_]oWO>wєmF%w讷U*g~qVT̫/)h@`GXjuݻZCgj21p0@d`Á5Z(i9L *C$32c8DH݌WP6E]whz$ng^|n]bDluogɕ+G@ bpMauDf>IQhu] /J'jj]sK>M0.Yo2*N]L Y%Jͽ.M|3J1bx}P{p: 0 8 R|Kא$qRLWPtl ЪrHƺZ * P_HD .$Hxp(Ȭ%aDk&v.[ύ!d .RgvRR)R,噼3qeؒBX5QjX~Ԏ>X+68 j%1)Jkh++G2 ,R qhxseq9$db[ۃbtQoqDA$ƊWIT_.XG qu8GngKLD.Џw%evAD4!RuyV "W6vAI!4uTȉd1sNOT:l toȐmsy\U8+󎘭jq+f祇8g59&g5™vBrJG8va#߶ySxA&&?SG+v)ڷE%OrM>ƋĸoSKyҖ?I>Let0]}o0uj@`pBd3LA*HX?ҜB'M;fJ䷼?p>O."ْLUd,!y;k5f,6{ykbk75>dlc9¢CƩpD.Bz*'#(r19a]R ~/se ^V KDLFA"\ @Q\ *TI᧙&8Ŵq?%Vu(6a*<)8\LJ6rImLK5c2'B 3Ne hdA@n$Y}fV Ȏ٫ u G $jHC/CnCf:wbH2mac`o+s2K0Қdfkh!ڴzo Ȏͽ =T Gs\ kH-$1nj]n\G\ 7„L%Cnk?nGh%0fS5{祽0q䍔Qs7_rkO>Zse7_?Ϋ$3oy5Ѡ?b"Gb2צ~MͦPeVlSˤ @V yWA[-|>F0Fv̔O;@4LC*ͭ NŤ̗q^ѻ'Ξ6zJPh":+^W2pdYv}{iIm>4o+ŭWr} eTlTpo+%l#a>};Cȵ$> @h) &I@ wk|Z/\jca0%#%(l.$m6@2AAzAFL Zᅂ`HHf 衐M{Q1Em$XN!: 727Q > W`∅ԑ,ͤ*R ua=niVK!)NqQBd/xa6~uG&hS!1p» ^m"so0)Ht=Jo ;LE2j$w~0Lt=VpﯣH(~l'be;e?('U@3~> C:#Hj GʝUmB?+$nd{O:cj@^[HEB~k5a%jnրj- TkvTEg@ ۠vQp"Zӹ g {8On ']&г{jXc1g:հ&TßluM5>L59#]?էj@ :[j ȣv튥BڹT.}O:BSvyR mxp#э_HRy&^^' /~lpoG0z~]7άm8^X}Hߎfrc*fgv2.Ylj'k1$L~ ]Hk5[$(GPNHʢ9,#pa--jK@F8wH< 8p)M73$D^| 1]Ȅ#<#rn;RvOw~ ᯞ^8rVg{7Pr#'.'ѕדjǭ GmV!ۡ[&i[8n nt KDQ-9~AOҺU,XJg7|Gk 'zGT @HǠW} F?GJc܊#|.o"JԻյLM~P"у|RK>qwnLȿ|ARpT,cmoz~[].Ddy"r" |rH#4~!dVxTr%ίgYF9X0(-syr1aMf6cy&~QO6t^"O6qEkF;Ym&觉['_ĩH+K,ӺVR]@qYzŏߞjܫ}/GaFliV;3 TK sSE솼)eq%GK|MEvDgVŅ #Q TU69bQ@(4rLAz<&\f.}9}C1D3&9ɵ_~[*ȞSӝA tn-6x ]^s@~'6zu(‚;iȊ).FjbGЎJ u_3f[@dE%2:ˆ y Q&b"2AkW/'ƺ𜜐ƙZ-i\\ט pV%=czmFgW ZB$C &eR8ɘ^B4F;Nw䇓c'}) %_GWvxwlLR8s^{:U|?NkWO%坟8UwOϳ*=<]~xw?±RdODt7G?]$z'z 5a|2QeV=ӛCongS>{w9w7u~?_^F0r.MGM6UAB!;DK˙ -E\3h%G)뎟M>t,Qc§ϧMZЂ:/ܦע^^,+׼O{_onC_]_"tۖ{ˡ)l B ~ ف`4$iNM>t{ӅNr-`p:_<ՎVlKKO**%:7:}Bh&X Y]9wuIQIpOChjr~->0ǬKfD '#oOZpa+Tą!l!T^Ɍw 7x}ΆawK(=#{bX{|+F{^+4'QGnw8)BXP2"S[uvaz75=V-~4/-\wl)) >׃?޿8NKPoW~E@C ޅ phH56oٳ@Х^j =+p.95M~֠.䐨j#IzOm5YiI(HY,ݭ4 W2:'q#V'z,{*33ϨKVrDOum=FJ39kFPwb| (m^7r:p[΅zo=hPYDzAZA$ 
(r$X9AI#ȓFjS`΃/Wp B\( 6Z.I :4$#9_ tw 4\Нg\&Ԏd)Q;/=v\KaRDX+`xMe^=Tu:0oNyrE!;tuˢ{ .-mF&Otf bٜy=a(N'8zJAYR76UPʼnLI![E?җ,SzjOj@:lR;U<ىhvԤ7y +;Q7QRŠ5]k~LڮCB9|*9_&voDk3D.0Gqi,j^Bؚs)BvFM8 Av4{ S* J=$j|D6V5~сl#-4ط-r'/'-f%`Ϸ&`fͰ8ltؕbrniY&8'ˊ|EkI[Rj Q3Rwl+1VUK1#~J՟tPy6]S 3f&{wrudŘ?Mve1# wD Znjhaÿ>ɵJ 6D"Qyh#B}e`Q|ˏ?jI?A"yOOPكupK: ^oXv0ˆhv;WE?΂~ʳ;L* Z6yqXeœ̈́06Q PR&9 5oZY=* 6q;o5Z3 +϶1]-v<`7ԴuŴʎ¯}^) Cpy]!-" _m| ى"k̝4i<-xZ)V0y"rKv!SDWt*Rm$"&MP>#BWN`墌{:<+ %VĪ#~Xyą(h9ѩlH99DD0fur Rh61Y09!K>M%V-avrQ*PIʍLaNނUGw@. E#Z36"f.=x[gm`^DAOmX \Nu3s5= ({pިai- L+ivI=in9C r *=8~䩏zpvY7eLH6Ju#Y3sJh]: 95Z?Z3P s$R4'tldCnr+fr}r]e H#qMm5Ii:oI4DHUjG9ymPZ 0n|$%&'2-&}L;egS 85oj{Jq-0do__C*^y0Rz!-!6bZsm+Eqm8UuEjd$_ߢW,+ID*Fg.5r][s#7v+yK*p;.ەJ%<dFk$-) Լ4 fZg>.-\:pZZFkcp]#j~:,f`HRcBq*bWDT^|+mm`; _]\|-~@(Cz`P!!aB8S/M`'HjbC,A6E8CdY6^2VM =aRI(T\Y/5Fzk b(CTl40GB\΃lm SceW(XI(3I3{cI<Bq":HJ$4-g`c] g0uh!I9L'ᗜg^7)Qs9`Rk"%# 1=8:kt{YҺkMrQlue%/j$T3}{U]eLw[_ȻP_[|s?{^472LVBx+ 8^5K[v¡ք1+o[E(AKEyPB T0r'BJ,X钲oᄍwwɣ]+T˺o鐃 ^-/zើpSٓ {7aF~NϿhf]uDj”AP|-'arvIsR^ W$ iH?$ \]벨7xpE-~TܿKҪҊĹ2[nA#:4u V Ɠ~HY^ףb){'ڶʱv[Mm5ÓQ׀ ^Tr ? 9s1sv. a_&[C`#-T5].Y"'zgoHL6[mWHT"]~ ,LqƵ'!ҥgH?q~@{)U~jߚ3X,W(lGPM x?2Iy'NXj&9WJq崓o_s Hk),dHLHύOR_?c! t.d8I '>ZE$W/)RߒR`nE &P F'JjRQg?ixyF3{LfH&.x|^ZqjsWNE':H8bp r׭\f%kS. !/ W](](I UGLQ|0;_OI'|J04ic&Vܟ2>4VffJ:pE\ bI,XIpyvۊڌWE!v,3uBN3Bn' ׌TjZ9;i,`!CSYA1p\SS"G0"޵DbJ(fz9"(j2 j"I(̩O̸cjMEرQo xD*Q*pF爚 *m̽5^VYz2>͵uW3sd 1olx~b7ם#̷BM|id(P"FROl&~~ntuLJ'U+Wx K /:: Rl.75d!}<3jvNC;>A#/4\h;~㢑6)76(}Y)ZROf,uJ h|+13%eԻr@0Uafl &s"mk,IL>b!?D*ADDO87JBC*O]WabMjĜ7*)QZT Q%Mxcmc+d IYĉP-;v'NtrlN ? NW2fЛ$@]:5PnVQOfw1pw_ 6qͣސx ΃gwz-ƹ#h zNoU\A1`+o=6nn1Zdh>EDKi9 Ld  s%g^WR>ܻJqHGO9<љ{~1X\!R<79ìT| p0Jv]d>/Ce;zwU6?>MC*vAlDZu|w5?Kg#TLEXeezl8t\G`( Mu`3{{( $K? 
E8V4P%q!{BoQ@#|( % g9 h7TCTb";Ld(<1Pp E?(J~ѫkk΁3'ч_~fv;{s0yW~=<ߟN]" 3[+f׭if:TPBOs0-FQ'r5_NZ× |&M7+rwTNBMYcU tɒ1%jjI%fي: If$X(t7*;lv4\-AjIJ}iJFw ٮO!"EЇ+*cT$d_H,ڸ; , ҳQ\8 ågcD[`yVQLJVqo  hbb[*#GJWy/iƨ!ji;N9HF* QLh+Ҡn% UjQ«`3FYae;#ĖK)+:bH.󽯓2ES4V@e**B/iKOw%*R-nUVTLW><|w*0zxԷ^7:v[ǧۺ|v_>~pWz}®JT6J:7ꤎFѣ{TѠʠnm/;顑w}ujT[ByEȰ.B0;(o,7nd/GL$hIS[_uk. lϲy]5 :4̪@#id;d( _kr'eJNԠj} I!mtz hk)M7ǟ~1lNNV9ͦP&Ey#RR~}о~\W&d{w+WӢE 漨uQ֟&塓t#W#_wW)B~4ht"]DJavU*u3.qsJ& *%ڛAMzgdHNGCBU;V[)&'Ro[:BƧٰ[@ lSpJ EުǨMigm~U~=GS+Ok9QڒIMkݟuhżuVgk^r&%RѲ:E-Åo z\<'eG)bV9(GJrVaSu=#;^O Qh9XLX2 @ 4lgx5Tk6jMARYYSVv c>`\!;QhQ8򍞙>&9_+ G%ȀN( D'DlQ 58vG 𨁨O!( D| jhB0\ EQ d~f24)2s0G&`T.qU˽&\o ?ڎT/]Bh_9PPDn?y[vԍ`Y8S^2I<?>wC]nnNGu-5UX$mÄpTKg1BN P6@ѷKXp1(N) ]\Hȍ%yrAxdQfML' l͆-e$r_``, //B*}5f_(Zc}MmUHEPR)eDmHA`LUJˬ5랄o\ђu~V rx9B@(4B*i8rtF!=A7ybtl9!e HXuYgllm So@N75؋uH߂ד%dO^瓐,dCœ6\ˉQ))NAjҢ+iQK V{5Mb;T맿N3?HJ$o3Z:y.rY|E7Ao7bi/`~~~*9L6q(~z.<>J |cnLAJNN!*JnBּG=кh$~΀E)iłD!LKX94T:'ؤ5u,<;RZMޚ?f>{Zq:R3-p~晨"G#DA.熟*s `垒KO]J61\n mS{\7ꂋ༝#|T o"pr\d[o*GfVȸqF#!ۊz5c>{+UBp2Vght<oϧ#$ayKWYm\a x-ov0>JA':)MJl-F + ȏE䥯^yy22U8f|1rĂlύqЙ,;%ZE25 %$~L P&@oq)R%k$YMbV++vL84֕T1ÄVPνyU!6I.p ʞ ĝDfsPqBHh0+D؁,ܦb cxJF{`ښ۸_ae+'g(qW[:ٵˎw_`,&?RI ù]@\n4}i'bR%Q 'I3P*q˔7LSԒH@9n,!"a b2Ze.U&1˄324S@63<v#5-Փ5q#n-~CA6g׮'vf3po M )O~mjwKLsSPoԇD kx"Fb9]mYALF+ƌ2rR):EcTĊ,T)2  IJMSFP@J~~nҌ0a97 t I73 $(K֒I_ DFJUzQ/-ij;هpLu' DMŕ/`1kf "YjPIVdc`1Z]< GmXƎ5PFL':36&a jҥ F#R  š\3; 7]tRR~NQqN_jQC'#ae[7޷tzYrW N9߬8X4?kAP~Kɭ2:@ɂqRsdDXFgҺ ͝F%˨o5bx SEuCBW`Tw Q{d]/BEy(We,1"FHNP{vj;$G_폏4p(a&dd|?(hzG^ûSͨԚfJd\ZYL]|3N]kӞvo F 7|V }4,Ǭ˰O<ͧ_ rRlq3.b~_O80?s媽 0qOsAx~= 0[i(A(0%UK\?Z۾_cJ2[ 93Iùd-=A[a$Joƨ)T ɵf%''Uf` ūw &&?>UE(^zSkW0 Nj_~vG$DDܤ&I"m cFpc`ĮX|g~a[ mz u el$8vLx q u8inG 2[ro6׼kL1fJ?|Hmz;<ؿP$Pi=AG8,K. raf7?|%n6wO]}wW~W_wt:ɒX"$θ)I Iu2Ҧ&ÆRTlI1Gy>Nrk|Ӧ>78_L . 
E_+ LmM#fpܐU#& 4g01TjE^:bIbI0DAݘb(d1$\$MnyҸ^JTtHoÖiEsCfu7ǃS3>ИޮIhÃ25;?;klZ/V>_93D*SĚjMGy73oROl-!TˆRfPo_V&u \v#\sq Zi$ P.y- rxwuP5ȪzgrD&uk,$扌&*34 Kgr ?R:)¬tsb'2d&n 9ӷwMvێZxi1{Yll2ToIuУzh=+cf-PKQ r[NxJQ&O=bml)+|'t8M;X-/*bcpz U$tEwػ۫z /$@uZ{JAMXA? ֶ0k|Qm'z@V*K\meRp_yw{{mIpFխRk̮Z7s?| śoD}bqV~{rkֺJOy ɻj躥!uӋ>-|^KA]bw^<7KnqcwbrZ1?ϫޥ-wb4ko5G& ץ<@i0)=V~G'3<n-6Zҭ=`´?& g yF+b;)?u7,{fI"ïbdNSg:*O WrmcO6E#2T#lE>r_> ǫm?uQ&{x{~-@FBE2趪JȦtjj<ve׃-ڰqW^%][D2P/иWD.EvxWVgIlfujTFYJgV"ęU gb3Rim/u #D SA! (_ۏ7ۭ1+G4$*q87ĔTTL8)-lY l@u{5|yLre9}kbav JuX BHuS\JuP~_su2M(LZ 5[quc҃VDzPMIu|()3+RKlԢ3QX$iJ%!w;E }­@A$3K!qTY>[Iz}Ҧc{z'T<@ {!1{:vpҬ׳:"PNN#ǾC|aD#8ބ& Ob_ERqθ$Z!)5LL8֙"%M1O}׃\hlIݓugg 5W1܊s&T$*'K$Y\)gBD p*bCXM0R}(ʋ|p!Bo~VJx(Mef6/QT3y]e^k`#JePRSe茑ƲLD  eTcc;kT"lQ;' |DAk= /{@ {[g#Qɘ&/c7*R ǘ8(dB$ADi"ǘ :)m,h++|QK.|%I y%}Êe]WCO|ݪ  e(cCK|j6,¡f_0u8{dDhT6}|et6_"fg 'R?=G>~Qnz?>RTrқxQ6h;-L']!QPBp$ <0'8l<&elp{Up1)brL6."AV{RB0! U-_ȄbA@eHPvPX͓k|@o< 8{ࡧKqfWolb[̽' 08^(P7nbPvR-K[X2ÊΧC-e.YZֽSq4Zjev+G0*ky]TNY~\(.\KY |{UGU(5gY2ӁdE!07ڹUU"*ʖh uDzyjIY 1^TCz^+IO\z\TBRҸSvҥͥܤq)7Ke?*|mKKq)(%CIjAJO\z|\Tb;ZBdb?*Ԝ0vңR*ҸRBz^Z(#N\z\ \ 3h۵_M?k7~XUX%x̍:ӆOZ^J[fx $L I>wUlERd[㓥ùQ$A.nث`s;,A&.A~F%o~5-"xs?\zFE*l*4i望f狪t=6ǻXjmgg[0=k>@[5Q1G0<4;/@/V 4Q$=1_k0g1o2 MQ\3Cנ:@\%.A`y:HUSijrp"ϔ! 
U<ouj9u]5QiTN׃'z VLjՙ+Ae8+cYG%0 Y&~-xlu$$EE3C)N\̑YJ-sņBw/4b]qct@.ς6i0TsiK2}YDWhIyFKlBrIi9;|^@v@.Afb'ԥ Py8Q/qlj4H_{4uG~tTE$M=յhRǣt5zڌ줧ueB&w=I0K[c"©咁0άTS' Fa+0Mq&=h3|0 ,$9˔%G[j lgme#* G?\~E\o_1bN贜gЛ>Ca[ܯ{q'_\1$+&Wvq~n$A THggŅm{ҿ<=#BJC@~U^%?*tU1#Yb-R>*QrPks/T׹grrn%%tOJdGBwZca@N<գAAD84֠j0EIdƣ$=@IMĎSqo{:ቱNgh?Jiuh뫾'&48a0֑a,uR%t\EO~~KqtlLq(bPauiu~#[m-^L?lD<8#U|EEEE: omQ>+l8@utq!h3=U[ċ{~(lbf8樂iz#!{"1JF{gaW ῃώjPBQ I) uCg{C,B B֌7:~o4P\}(W?]Q+>Ҫ n&k77\oOkqWn׷- d+&ځK]ow_=|wz+>j|_/lC-H7,o'hD.so'0={ Rf.TN~଼}ln0)EbՒ$Z"2c5VJZn+l-HkjqK_͆ qPux(hyHWzD-;f8c=I=.%.%l4& -i9գ+f]7FKƨ#P_DBb63bwpumk=Rs DfC /wݫ {0T M3n]jA"#H ܵΖ Vk]_jmti\V-[g4Su֍CAݺOocp`uˣ|j$%/2/__{4zݾ/Z]8pL4E'{w~u^}5sR|tӇq<1 i2vI@ x)1MO(յ}Җ]Ҧgڈ[^(ΑRg((Kw)y0 @ INy DKY6 :6I$$_G]:mFIR!I!HI$QѥCťUcHIKqAFJv̮G(MAN,b,R~ YTX{EhZֳǶ4+U^jkÌo !HyG~*ĥiutClt$83gg>~ob(fg?p3ؙY tMY+혟ZKTNtfZZ"*nqY8.}yE wt ԖrBh1 $CU7>USo \KR;5NS/?JҞ޳}1@ -J-jtg  r5u&~<""-LwlJV[-Dy|X eKNK- <0^¸# jάQLodRi~B WR wwlFS*HݟĞ͏|u"E䫋2_{6[e3UlswJZ+BŔfP<@Dʅ?z^~A~C[;c>m_bhM?ppSC6d?/C߿̗+ 3tf [ƨpV 4)uh <1ްH44W/P !(F8m K9ldm|JfjMAV@)rDBڻET!*Jͻel]s2XIi'u:);t $ cMqQQM 79[,^2ѤS[,訕O jϠRvr MW7L|i/q9zyS^9*]Xn$5~ 5#E_wFs@(sY㨳k.;ԥ<"9Hi{_o.tRiO`({- ʤG7UH)2)S((eU+SmXJ*MR-+~Ki5jJU!dj.k)cVdWH駩>YD'0J>J)WeRUoƠ )LJT Tstt<xR LJTKbҽ2 o:12 |>췔LU!kPRlt/{K%SF ,"h9r jɀĭ³n_D)gO12PΖwUsv?;t,E²ю.k&O<&z=(lۗQZS̆Ka&mm]i/u4~1׳߸dч܇XKi:V!~v[Wm\E49>k"i[mY]Ңܮ66U #5|'e7akr?bd4 p0!(6FGNŶ1%cܥA/a'ܜ6h⤷V.fv~*LKc?7b͵n-"[wYƄOu͠suY7gH]cQM>x#Үu@/y` jeϲ NOLdžlOky*T;E?v{|X|E/rK]doA[CfU&b*ڊ:fٖy\тzvCC7zPI-T8׀1 ey̢KX)-̴ D#sQ4arLrX6ؾ0Krn+ɷym&J@"Hؚ&B "fi4e ڍrG|UR EYWEueFǣj%#⍍-%$%c,a0% BV@?s|%n;͌O#} }Xbp#1@&U,z@ < yB=)hƕ>FazF|@Ҧ |*9ٕb`tEai<ч#"82{*GH":jo-j!+I:lqmpycP?pY|IA-י,PZ([ 6!Ԗ5wAxplmVlCrqAEw*/(05+{5C/pU35t Y> K/UYSbA6J J晇kW q*=yaģ$w) D"QiGj a:Qn<9`?,|AmwABwDHBvB=o݅\Otg7p]IM9Ŗs_?~l|R&̖_ii{n'K"DB4Fb7׻u ~vl1y2@}^>fONHlEl]!r3gLI߁'gnv=H&'^4/ٛ32b,$pIS-3[\> xENg:٧>i wXI& mRmDZ ϴXΪU}jS_ώo%vA[Irsߏ(G_&|OW^+Q$V^j~ Vp&"SO9X$FuV6yZt}eb^I, VFWX 14ϐNs96G-8y9(gt@9G'bzC)Z}2Z2h!=!FAY0L>D ǔ1%胘F%p0b%G b}~,XŖ.bxjeHoئG5}$qrnݕ|­#._ 
atT PH G><&k2g .$N=޶i-oҔPUI"xOm#+畱J&vEOtrH4D!M_)f> _ [.Nx5^-._յ=2l=ng٤fn)C7Fnadǃ4X~Pv/^r qoժT==n0YᅽR}Kzye^=1_cDRSLhBNR+HVR(cPS?%1DE}Y_5,{ ÔPl' >C>ё$-/ihʤJ'_'HdBi P'ݡ'\Eh*IT&w1ƾf35_xl 瞲,grrQ zOЖprp^VbtD!h:+|?sUd.-'tA6a{Anɵ|v% T?hYr[]猺N>ȃ9$~JJRLy0.l ?I$˓W[o(qV͒,Zo@a%,9#ӷ@TF p:8dPxn1RVL L,#&MR)<L9#8n1 $@ыhn/uJvRM jt(1ZID jb3` -QK@6|gT1F +@d,| K¶Q`>`F*"8|-͠v P`PϵdBjnz)R)Pu`VgSո,e[ymwXXe޻y2 HLIN~O\boϾrZ5kEb\bA+*9tsu3InvnߓB*KOU ;\񴫱nnCP\ Zb03kNDPkoFD:Fqڳp"N}#o2($> y@#{-eÁߛ o/Yy^ܡ;׼٪V2%7F.;INHs1o=;F Pm=saٙG\r09V3~@ޞr(Ώ5)(ƚ?*X`M=\55''6JS}hv[[eѣ r _3Dr{^9^?(SdvYK(̜DC=pqK%hE1ZXiW`DI2;bNc>^lk v߻l2v<\;zvtb#'.0XW1RsPHγ29ytYF0VlU`ր''2 WG4jǒTȈJAZ_ւIEN:9@|<[F!|lyw~mqOK?k~u@5ԧ Wm:( UmO:p8e5hQ6P|NwD)3i{]Pr|%,,T\壒2Wmc Iksj(j;V JzظE"ډ1w1r  f%")?D*8=|<`~)D`tH1!JĤJI*()nlclOaJ>;{24ׯ/A)`chLWH%2ƒ6d ! ]Hy[(!D_\~hv$FB0l$)͑ݽĦTӹŝ``м7/k;%>T90(lX|+S}՘I['@~A^ʄ!Y7j\nNkW'4պqh0oyx4ZKך6H`77[p~+_F@sy^Y՟AVL yvWZ90l4g|ZJ/_a#Fӷ~5O%v[ quչ[&As)lĹV\0y[in@zQ* grg F7.؉֤ldIKteq 08ń,m$>m卸w7qM8@HԨ&K=:(+QSakB9/Lb8u:ު`nq!4K'hA"hDY!lHpL)UWWW5 ta2}tN L52}<h} $e`K4m!j H˥ $ *F .6"FKf,HW΁}m[+9rQ½P3uGI q+暁O O2ڊ3ճt|Sq6鋱t͸D9Y۬ $`A["q\19Q:Z I.X:PCڷȡ:o]AzT `A{3G;~JπO;NYZ>^}Y^
w.=^<|Hҫw;nIYd: P MvԒ'R?/HӅbE5Ɨ`~| XRt>G# % 4|x佃h-{jӰru޻' R3!SND EٲjNK{ݞmoc=)qÌT?d+*T@ZE Z%&Cqio?_Y~2nr h4QA5#TH2,񟜐!&}l'㑻^ }uc_u! z|(1]X*g\V!&ɗaeb ]SX04_ #:ld\%郓bf;ƼY/մ!TPׂ^̕6Gu`8D 4ZSBb.qzG!5X &zNs5 ZD8bIr˨x,:9nKJhqOV w4[ЬfZ1m4KR̈l./)Mm2Z&aĘ'w)`}y0wtXe)䄤kcsҦ,`聯JDKA0@toV>)9٬}_a"P|ԴWDNzQ %?k|4㜱M'8 Q>3rڤ7鉒:+,~mF:VR;TbLZ>;) )q1pq=@O~K+߂ƔSb|X۹mW9_F+ٌ2wx)bQr:Lm9pL*QFX'[wx:f@֕N*b`R?ZzsCp8wːd~F;7'aӑM'wSk7&1],/< R^Z3Mfd/tr{~ .>p_&ׅ}^u&]g1oΟ.?)Νsu@H_9>31l,@cA]w|`)Lguk)*e2_m9*\hcIcZI 9,иJ3~E@L7vP{֐:ZLK* qj(Md^p1BtBCwP: Ӑ.#"1}ɻ;+@dq>M`e #%)iB`ᵩHXc+JԚ3-;j?JTeK HtHA" F:%P (}P'%y i#GG]@dG%k7^3dfXI}M (s+dzb\Ocm(k:.)L9s BV',QZlR4G\е y_ݓ~ǧ'cCI' 9Md/ IZmsUJ)F ES<5Hɍ|+K f< P3YƹB+r\ZDQ;j1SahHULbhCdIEhO!Ђ73Fc^3(ŐaXA[<z!|+4ѝ:kcxOnu2@ B~ M=4GϝadIj9X`) &n40d|EQVVD4:V}fԁ%mM}˯Z[F"5 (AAJ׫ +QZJ O}>J,lPFIxw* @,g<%}gcɶ7*+eC&yBo+%$qnYI&h$ \L IF5Z-ls gGĿϳOۀA1)H#*svTӉ$n $jaW~XX*M¤"KSRJ4lcGy֊.)L(OeEHD|gHڛHM1# ~NC8;qQN^X2Aٳʜl8QzFUmh O-7/WsҖWRLkrpZ9sz W}y"V>{@m 8NP幰_B*ugZaܺ]3{aV/~.FP:#B[(U}lלvdv GkD`4`~P͵F4 .rMބcjNZ *B}gx\sp nh] x% Re%6a I<.s"m@x2ލtsirNGtr7\W8ox< ŽtiO\{7qi!.JlN߶!D/Aޏ|m/:e?O^wU{ͱF!A+B9S L\rF!-!N-\a%S`Y}M^ Q,P; 8][0a#e {Ư޶|{} +?8b1>WGaPgOsOz2 `6qz8ME.-e6VFx[76rZ &-sR5`[Ό^lOI+XǛč3s\T[Lg@9#kDl?Ɖ8{?!D~Va;eȕ%ueL稯>1~>cWuA5ÂUcǹ|̱P vaG 됡+BX Ŷ;) %2pC c+c$4uRs)VHI}v\S#Ib.Um iTR`ƀSq*4F 3cHj2M@1 IPsis)(1X%uܳVewI5"g6~eORK$2{s^C=KLӸ C͚.ϐr %I 5)P;nR  E?K+ 8a+^JIJ} \4ESb݆ ǽbRA#VTӊvjJcWEPUV7P6XcD3[vP:Av4˄bӌ#̀yD<`AK)5DgZd2He8F'|@B_8OЊR̍GPܠR&.C\xE>ȾLFAD&Ű3n-\%SxxdF!ǏE^|}x6 {AD$'˗"/tƓEv:|f;]?7%/3Mx6<^~|*)inM|2\D0hnsH FalJgrzh+9H˲ݼe 7 F,֊uuÏtԗ܊o8hkx` lWЧu;7u φӾ C4>t;rs)/tC7= ꩏4)>i ~k "W鏇 o2Yr6מnR弬[zI  9s3?Mg !)]i-bP?:; {րA~CurH\h8j<4'G=8Z#s!`j~ϰce TH(;yxR|stNFG?evT [ڠZVd<Ib(/w2U,>~c;0BC\E)YiBpQ[TWZPg~ROŸm`5ݿIlmu㱭nܟ>^߱]u1Cl-m'@c@Lnz߳ѓLP xs3|㙞_3 Gwbxc{%jT{{yoF^zL߼\ݍxy&:{a0V 'S2 ; _cn4 Rn4yQFBqbT`hn*R' ӜTR5h&~aƴqP10N 9+߆On4wy2_'Ǡ~g?zoM!\*xay12[o &瀗٧dưv>3<0V2G;Kq|[ ӋP1PyYqrnTh,@_~ uĦO_6 %ȃPW{ŭ3䏧r%},O4.v{vq`ZS`E#˨#TrTFrp"xAK`K |-X\]^_^'7-5#LۇN2)SX,aD1b*<tCyKvkAQISa%9DmD[2RnH K͌b<, ծ:#A kxw` ۃtqT-6)Jx 
*x}*J5);1Lsε9abFLDh¢Z*Y: yboMĚo7QNd `UJv1`bGĠ70*˱2]䪙5L1}}Q7Fo3{W¸mfֵf>q;3/x ZBuy9SWxmR҂L;` ӍKq;a8(e2X{q)мEr7Y]/VQ#Z%|Bӱ`m c/C5:bG52T=WrRD߀6H !*ϸ.l1&=ÎD(1QgdUڐ^-]ݩYGK,¶@":P^,xx{/1mj SBva4F<"JGUST `:!![фP )YXR^!*_y*5\e#&Q\+hӣxDL=}}h7U:|J5Fa\K2Z<_ U]n'RK:G Hr2З!}EE|)i((ȫ?{ҟRyo-bwnoKGL?y8'808q+3pef- 8W:{P>jQmgc5eW0 Jcv_|3d8yqjϋS[IZ`1-.3-"Kf n|c"g1Թ4vZM!$=:+.k\01w8ťGr7}@ &мe\>l!Gҙ6'%srKxvK~300a"*KR~P)oYTڏgJZa ^qky?LnXmEmP`qנR8=QcS !dĻw;l^pyCfbc:X9}QN%=<ƄpE*ldݍGH 6Xq3CnӨ'ACt'32axF`RPDPe}c3h!L+o8<$2fUAuf\I郰g[<`^hmԑ(,S4]icsX)C+o4!B`A_kALdtцjK 6 hӂFAϢh&}2SQapV iydL˓X#K̗a#0 q@#0!!%lFT-YQ_>E?7fvDk'{v>yw]-?;"5]6 2*9 ]aUYiDK#Zɖ!_J[?+ByV9trbMx):gnJޗYm@;غm{[;h'}ӣcO h e? 9PB+%o{݂{x%x'N]LF\_ee&Ca&3YN8]''XI6bF }}h'Yc ՠ2Q0aPpҐ2'~o9`fPI}S` K&Z  bʴ8c'B(wpjJkL3-FC>k%55u3)"i51! 9"%Z=^L -hODs{;)@=N! ꦵ5([TB){oTWtzz ;{TYz42@{D_ШOa\g^h)ɉRC%vXKnC: )PqY H?^Z2K?mB_ꏛу.~XEUN_ldl fU1&Z.cTmwϿ|bܜ[$>(.M:(\7(F$ EY-^6m7iBq|[ J8Ff9$H</^)?ED ^G ~dk\%d5-<# TaJ^Z I'L4)EݚV$lkv#+v`' :Fa-V1lFf+;bL+>-X /_ҔGC 1i I/8 {4cneK>{Lw45|6Yk npհ75L)6EWd !HujrIz˨ 5O{K!ñqr;*JC6|g>*j<F6␥swde0„@Eh H,yz(<+B~pJlLڤ?͔qVͿLCR"1"Etg"\vZM0}nVd |$^m,>ȇj{s<&?8Xw\00=}w,:֑ }7-::9ccrz&#|q5 ztg:l7]CLԫjI +[LpD\0j Ppx`-d awCvȗd#W`_C8:cv\nBg{`?!!oY0e +m`Ef8g^238}\ 0 '^o6ˏn+":9ӝǕia@th6/`5m|ffn >wZ XtxqIrv} B_!o 8)0q*pDUar\MOCGtzBG%͔+*D&;>\s; BF)̪B9H`Au`G(B`_`[( Y>kʃ1WMGg"iXW]?)'5{N֦]hbf6vY?h_g)|g)|^`Wq̅6K % i1.8U9Qn>ݝ3qa{˫ IޟF7' :h݉_-%J&ZIsROFAWJP|W8nBjaQ@NcivV2P1eZ3>rIRܰJ\#ea\dJ$S4TIۍT%%Rv& G S\H?qFkyxxBF"Y@5zˢ/,"Nx>i,i,vDa:K>Xo2qINpMt`)C`RD~ʉv@J)O#W1UzDaJC$emz8BE:TC],~&{]$W,m#I !,]SC8.c{%8K$jH&%QeSG݁#zuUuuwA+FKbx 2VEcG=JWD٫|d&Xnbδa-.34:ЧdWğ=[Q߲8+dMOOW9 \tu,07 |0 ZB4<p;B,e)!(6" TBn]kN(pHa6}S=i}7 4|/Savf6s|ӳ<~ɛ_O=u`AU| W$AdO[:&@qo@L KLf\0?yɇ̪llq7w3x0ĉqw4J|:{i6} }p=͉h4G7פB{Y:J9??W~w4{ z #w?yo{Y%nB͚s^wߺ -*aW$J0_tV'=U喤lgǑ#n`A|2]uZetQցաnP{oǡU K_ǹCP3'ܽ4Ӟ'}<4mҝ *MJٞ^C30%uANSp:/7)jrc`<,/ a!]_߮nR9Qw?#4^?`bcooYCԱ~Hc^/:8^A'A/7Y::X@1}78׼ 6JChtK>g gގ ~O`:j?Hl fED_WE~2pNH)eI3._[lwΊ_ -f?x6>sT0;޽[\󻻿i 
I_aކ*"!A=/[s&PCqx-q}h0޹0r3J6yr9FDT60:i{t,v\I(Y#7?ɥ-p+,1 _| BsgǯOu/ (톅VЩvc_#s0-K1M`YΠ*I%/]}nҍ`Zȷ+X°z 5Q(q{MmvISnP*̦hR5y9"b Kt֧bFTa{&E 㡗fըw+WʢHbC0ieLsFaL!1`Z.e6oa{%*bIOc,@Is{95 <&6TNLČF1ZIBN5Uq0MCY9Qas B-AeM#QQ_"HiL %R6hU}32Gˡ . C(*GW(>6~X7- 4bI`G#j,+o7J,MGGmg^lͶZV% MadP%6[#ɭOtbP`jÀHRcE$ৢ02BUD22,8!@jlP k fp"IMQ0G@PR(FGW˴ mb{Sļ\4Ǭjk x*dnQ)<.`4tӝw8j) bsI`4u ?]GPP-:@qu%2 2*+ple Q9 ȞM0!"[<̓j voZ͜h82jK'se)ۄCle>M+Yt~xyDe!ٽE9d+̘xFCb;ֲV-SBmmpz_Qɮʺ$0 !7ϨbsXt+1OAM/eAD{ ~8~ iĈ:O =Ёj7b2z8&oSVv$!!Us.8JZfN+ aBН%Vкfl㞌b4"ggqj9bYeF5zZfi2t*9/zY zݿL嬇^oԏ{e?݈kd}ަۛ:ςAޝ#=2m9_|vO=x~Fb3Ĵ{΋}mw.`02=ig_d?% YfӀ F CDH Ȅ$A: f/M_X诰gQM.T)BE Y$hp1{zI[ee^ɦ)iGtdlroUM[O5Sۭgf'zY.(~?hil64}gvI}5CU8+.(:*LoF{SQ<|v;m&m,luhcy` MB"j]^|V&|"6ɓĽSI%.q/f̹HG)z(=~ m+nx63/NMr} _mAsxTBw/>%j|D΢5U] ,G`FNt@Ը@[&~e]7Գe=oyKjs '\=-;"%ޯ=H 9YlLJXze?)rxaMy?N3@<#tjI4giQ'ꡅ3DbKjnre(~NQ @U9$c72V(k Iqɝ[cuhSH।ñh]X_bFV#Qiʟao-ďLgwS̏SE;J.U5egے@9H#61YQF@xK:s]yWs0 d0M".#cƒ B&][v$clkgig3˒w,5,k-(" 5;(^ Ck+5o3VK%z{6Jrv,[+̤ԧlUѴZ=aӮmKhٵYA5M]yjpg!FN]Jp`LIi.dB?r+ ѭC,,^`*7,$o$d;hӄ>JaT*GrS\ap|{ڃboaA"n *9*=J oT XnF~ܥ|uQ7!ƹ-rY FS;sx.şOoS7B y߈[ ܌ͤHt߸5N [ܙMRmaQ\Qd/(3Me7R5?ەv@:50I;SlKzqL=p۞{@hKjOM)CBUK3&ȮB_҈Uzo_xW.'ףUNqun.ġv9}j8O&aJw||ah UsuK2:iʆ?\2/Ma\dtzr CAv\Wy)O*lBI}HsS٢A̮&$J.5/"Tk?ħ @:"f7=8iA턆҅l$|+bţl}"R= ۊvMfgWv4lvS,ʊ)m5MD4y>ZfT;+O# m@]9ׅph7ӕċx&J??*(wU]6 %ln2y]qv+mxu4_+ N\s^1&Z/i~2Po=ed."`R;u 䦛DIԜ& =upqF-c9\<\z+g*N̻>_߾o߿ƺp0g=DxWvjَs"ԉd㽦[?W6M4oKMJ.h':R#m̕1h${i8vvXٮW@;]>BQy^鞰m>ib5ih@'V`YIɠ<=קT|ߜgt׵iyV˭?&)P?OwvfeIJJW,XnĢW gZz>Rh._AMi5j5AϋIg3-* \5|ՔVMQr(2SV[iNܔK˼g:2 &hK0#?վ0(EAv?1+&ы`zJH^B1*R(U#a2hAq^kPlhGJKŔXrCP㟃۱-@9<~PNh%y)LВhlZR@f8NLh֢n]).!9p"SktP :hv4 kp@@c"/-vt.d!thc0 $GC PɀKnh#$ YY7){T~MlaQu8$Isaypt LmU0|o^оTZr={3VQ i:FzjG? 
!L "OoPN>ѧu1}T~D{J1gPO' +@\3$$Q| p4o'f<1A}ʫǼCwݠGaœjt r+v@bV{g ?Eww֊3Vb5/&KYfTJkw/;ۈfΔ$.^vCϱw7CT P;{vH :gDG\s:|P_ɍ?8};r\ AK>NxI<) MJ?V[ C dw.TTCXx^曅җŧme2iFTScw3.{SҝwżSB~|ۄF$]ЈL\:Xv'r6DĬ D?,e?j# ŵȶId耠VJ_XMiA!ݿ6&/)7-O^iţWBR[ QV*K7# xZޓpSz 'h>mWdK Rͣ\Q&IBrAsS#MS.#z|\u("t^3%R6Rw ]7U4Kg}sHTveC1tŢkg07 ˊja,h&(~ ږWH]\z<*?YtGתW8+G'9HwD;NBo|;]}f_R7o(iZ*׳s47+ozNGԹv+%Pc n KIr8?TKoI^P {5~eLpS1,eaD=OB0B B]#/1LoȉgvBI6)6_0-J4|W2$Gh݂Teqi9̦P!Syy9SAKkI"5$Q5<žRR* ӦpłjCJJ0Emܻ:B̓ig]鴰,7Xu7Y,x*PW찝bi!iv!p q27!/z*櫃݇*4M--Zn@yBpoCUp*Sĥh6pnbV[Їg/&SqFE鯏{#YYP:\`m0--9n8GjQTuŗ̒D`hB:*hnt5{S)t aI,9_&̜1}"QNjrDY)%g6n9s6jCY2oDJQ6:KVʌ+/9.qtDHH9xUO \A΍%x[ryI"!v4 Mk% 勳mt&"|tg&BG#(պfmɥ.ٌ̐(ӎ#'CJhK ٣]ٍҴT,4JԸD<^Y!RdN&UTP㼊(Q%4Z "/7Wҩ{\m{"R "0mO#iV} "Q^ 1]O Hk9z(-dP߼ٺ|J]zXz=3d0` /yVMys{̭#Rr_iu0k{IfR̜׼Pu~Hypǭ퇟?YNO˧`#ɗ:W5\y`J5Ù F-vmqE@Spk|̍TRÐ?RM浘{ӻFAzL#H$K"FYNKYXiA(K(ؗT(K 7 =D&6XwXY[ A@t,'-"ʎN撁tqu S2E2;|/Q5)_Q ,$$ŵߧ* &z .Lg!"?f?);ѺLuD2e]#9,~܁H6F9%qVG ,I5(xAE{gZ۶񗞞)eɇNLs7cʖ*yN E,P )ᴓ2E,~X.9ோj:-/ճЁ."~ja|4+1]uo~rȥ3Y fIc|w@Wd~VStǑip˕#S@k%N}15J.˲։5Ow\6ŒXj:FvDF@Z~дv xj:$mM&I'[(>:FvDִ[xPS!!߸ɔ`7B[)GnNU1c$[D}[hL1vjGFSJǽ?u { ,}u%H!O, +`OO ~B'Ի'0@'~B+YK@'~B+#OO PJ@9TK $kueKCE|T;xH!(&1Dw_>Lfcv%[z%nI\L;z]jx;ZgEt.|U PYiȌ{)kUlŝW4'Moŗ.TAVch4٫@fü̧tn܎g:Ui8N>X!jӄ$KDqdd,͗lY}ۋZ~uUЏMJB%IH@ن> A)8AxINΪ^^vi<&!hwuv2j*"_g ǗK0㾵+Ğjlu PjT  *TCk|WTӒ^p(>IcT"O7%AZOD}xꐐo\DCdrvtI;1Hb>sӤݎ$S&tc'JέmzpҸW149Bw:FfZ -IxܮUEf܊e|6y==*Ao|uu߿WV Gܳ ~l`2&H[x˺F8)80pT4Cma@VD[]^|,ڼ8Aެc<|r%~JX׫׿v,*9bGC,B受-xNzPC x:(cn{>qM/O۴TˋphPMhR).%"h3wO!4zݵGO~s[љS+_88_jeIرXTa(iq H l–)5ko5΀P[ӯe`Jdɒ|RU{65r'RqY0r9p^P.&SP8M_,M'!f@g g3!z!UO>l)rg@cho'W+>6w5{v/\ѡ{!-% GUxtGUV@y՟oZ7}ش;̭}~JQOVCAzy,Tlu=${H}U翱|ʴ"/Ɵ'nf捞fwJSv@),Q` ARB $QFJtf}KH-W5ghQNd6boYQ P3}l}.뎳,.*K-rMS-I*3dp4\?^0^SYS8U7Υv #m阸Ot _=W\l0cS, P:)GVE.:T8z brgOSc픻XWj!)k`ܶE}Vs}O~yzub.@ܒvDf6O?ƉMfj71`vf7.`vf7 vZ%ꀌd 3iT UCt„dc)Asc~S+>]itVbmվ0hv}}7njBPo`p }2xի/ +|W_yمÙ[,闏,.5K5W=ZqIx”dXd2"T) ,M45DPv]^+~fWhjC]m{ R~5řaj|`?xe3w~q~A>~5Hp[֦_ߟ]DPDO]Dq4/"bj8wfqg߯{ؓ#xnD0("9-^*.]^5?.B{nB*Gy_zX2h"PU=*hxc@w ]sZT'Xp[ϔU_fZ }=1ՆZ(= ~- 7y u0/z5LSd 
! E_~A 4 ).K-[j2wGѽ]eW- އA5 LcF ojtZ{;ϭ'Tj]B Q ]"&J-PNǫwS|zG@,S$&u=*77)欎K͖Z;xvTAk%  H&G! h1lD9< mOl'}EJ@y&Jᮊ&  èpg |N~5wOIY ÀIF$Cr3 Oݦd2G%i@dݼ֩Ti%ީR4''\?dFrI@i 0#82a$ baT3I SB*a9̴fbD-Q $c$3y8PXkXMK}Bh4i4H+ a#S,[pcJeK,ՙN|?M7rj22ZOnJy'Tj[2uUeDwj:TvN[79/Ecg;s<]twn5:2Mv dWdɨDbzAR1+SI_\”lҒI仺IpKwLJZp9A+6d6f] `fbY&#xnOpp_S.'4׆A4N&8M탨Z5R7z%}|Nw3X>OX/䍱p洁+9;k LkDgwwIcǂ21! h gZQw^Bg_8gg ԡ,S8k\ŻM^<42SBC ;Kwu9k#Tq X^?|9H^?AaSR{E }.zh h)5n9@:F. f|LocȳGl2^Zmώ+#C>WN20!S3 $ wmm|9XdIb@O'8HgȒ!Jx[=Ma $LW]U_aL,G*n9+w`FT!WH! 0@"8es.R5w2(QҪBe릈j LeĞ9GA"%SM<$3lmX+L Ueڥq_.L:#XxSv>KNpy?,f, _~|s2+F(&O_?JX`QWm:\R1<1ADLgo3˯ߎGqv7_u:p Ҭr%سo$HY>케yWfݕ7@LºC{=F 7Ƅ#l r❗VlT-5,Ya)_fƽPPCT G;/B:̳WbRLcć4gqi!C߃z TJޯ5GP=k o6VY4)k‹V/m kق Ziհ#2S'^C4SR+ז/6Fi~GR˵Be""o `pWIJڜ( P\*e| ST{*1*DJ%+,8 #DH.<~{z!U"W48F/^çxR;B]o;;_N1tRJX:n+ Q\q7/ Bl=L"kI{ʙ@|:IUAE,&W^}&xUX?*ƕĂF?`fJ[$[7Yb^dQ齆+T` O}IPR=ځaH ɫ$! *,R7cHv@**eBvWHHL!S'^n)R4DtavpW"(%ZSlf3&So9A{fU[M<QD`BNbP$\r6x\Qo7 #!O'‰W;!F0 Rc/6)c/C"H <( *c ?ŏ/ڰD,tcX-R YP+ ׇ;3ۡ5QW@J8]LS|b40uT3:)[t7ܾ;;#LQw4>Ke:Ć}0Id??222벮IGVsR!)^VqvG)*KЋd@]|KFG|qenD!T#(ci6S>nf`DA|)֋gʗp-Ymae}[aɁGTTC *yBPá%E{2y0ȶ҈"X$Ah- "…4{&>22 $1&j'N']P2fbsJTb0\; 3)⤏!pMԎ(I47n6_`ERΘf \?t>~2񳻳7Q&{uF˷eAdCAujb2D x uζs챺#+Vyȹxϱz< ź# [nwV/u^gC!Q69]CĖΰRv!COF|Ѐ$=2uC)ezY@7iDl4`."VWX=8`qMw-%F\Ⱦ٤MR&JJ<( m.j%Bө ڡ0MQ6Q!Rn:[tuItňw Bu^i=0"vq&&yDW6|(/+8suoJT38hjh.(zt<_gswv=gG%dm-}s5W2u|І Mdb IEF] ̹N"v\!19T,Hm"}ips臋C}gf`ǒ[}*; ="c'Ћ!s/j,T9,w p8l_LVpԂ\ŖԢ 6Rit.Fa̭U\Sw\CjWRo￯Z0w}εh b I-FXy-i`ΥN9{p]uGpbJ LmEA9zv>*<-Y+7"Z ~ۻbJ11g4n˻H:hۥw+'nmX+7"in 9VA>w;Ai;rޭ8=6׻`!DlJ ?n6ѻbc:hݎGByrLֆr=ܦ8JuCB} ]ѝk:X%F]v.5i'xAۉ uEкR"uUnۺMhQ~U+ ˴M}OZ=b [ŏ][)x[!"u3E#D#X%3ES#=J_UidF)Sr,&V/0j4:.]ت+ܾFqCc% l(2Xlj/ p8ϨGTbc=֔B3hF$}խ=#jƭb,o$1wWK_q;4QqGr-OJlqҶ ޠnp ~8ɞgBF9J qjaTP]DEqru* lݷ[1Ιq<8 `Kԑ Qxf<òC;FdbRuRhnd%׋çDgv\ރrM)K#xnV/ˏ='S%g  x12n-_?~:Dc+Kz\ήj"ۣgJxtm/cg'Li6i5.nk[gj~| F+1^DϿ]7^C]_lJ-dm+T>IT+:.K,\iB^nS\Jz>n -T@J6):/[ZadCx:Q1^HZl7I;y3%PP"Ov;JUZuC2@`m/)evA)?6C G'DCt(&{Jl.j9ߙp}pza0vmj眜$ǣbJg'%O5e1z:--I 겈M%U"1Z+<gjVyM{]MU17Jkuko3`Ƿi` 
1TuzJRlz8=H-2YediuKa{"yn4!Q;+DH籁W.%{ Õ>/L:}4LK_)O|*@EJ{u'ҍAby rMI Dq hp8հ27FhrAP"x&r)DZGP"+ԳA4&glzUuYX4i]&uW˿GTŸ;م=*{Y EO8-𐗈ᨸhtfh@J:{ɨ!aUK>7BƼ.m5o*EmI.D$CBHv1S5=Dk{Tp^c sJ5fq0~7n;e>/`=/`22f9Ps`SFeci@Z8r38]]5CcbױXv>/뾷 3?*Ӓ !,Y`iӞ1{C$Gj@sN:((dt&VykJF1)@)R zs<xkUjrЛ!#cޛ xHŨc&k)aBRKMfLp"s޳@9}*e-K@hbW'20{B/A^]w*dۑ#مW΂7q7OדE: zLX/B߬Wo ZM>A_`#VRWZ4L~0[6\/T(lgvotR,MwXSzA#̣Od†ӹ~4_f|3`NKKTWi*4 ZB K@{vCԊ{vg yi!z5IH>y:!Z^:`"S'<(}@A84(8"Rm` zYz ;a%4bHKp~ĕ4z,ĭӁʖTwnO5~xrLP?!ЫФTt.kԚU߁?P-6PamMjOo̡ }tq6U ӛWdt𽛯gmȝi312p1o'ϩα<.uòDhRI臉E*V]X o]|rT]V%Vnzo\DȔ,I}d9ڭ+)u&xW"iyxڭ ELIN~Sh_Ͽ^?'\3)CNK"+%L`SƖ12 Wb@}/ć.*F`^$n> MK3{iIN 0ܐWaj.-Z~tM6 MPb6HSGύz(7 kΓ?aVQlsyJ4>[w7&O6#}LM>6$\q`#X)!1n0 )uN’ga@s JV9E95&$8۽#vu: .2fX19%Z@1Rxd!n]{U+EaXYF@kzmO@`'w.cE_YhÃ=耮(oϯi'?} \^~*rߎ T{`Iz ъR׉ԗv &y}ΫN( //%ƾVn )-p\J8wa G#mQNcOpI~n<)Mڬ!LCESG~zP͏^@Nm# R,dZw^Mr#rKKaG\F?J| c+4>w`Ypۗ*ou ެc?“ ԗ:{uW]ls.ZzF̥6"0Ay/4FU6ZA]ۚff/3?y\17]d BbQ26!Q%ƻ􁜒uf"p*qR¾%X9Dk \QH qP" pPrsF,+cm\b>?qyz,~f, 9יf22f!*}U~\ap)|,e"|+bxoϲA?$Bw$ <)a,) 胸BW=n?^%]3I<2G$ꑊXQo8"-~Ī}*Z! 49+b틆&/hYB-fHK) ߢQxrQՕ _ocɿ](Y oH,z bQ8;`U~ ř:;u" -E׀ (CIKM-ly[#yO" 3*cef: i'xr=d͊4șY쬛]|oB. MʠJ4$V=R B s[fX|-.pc"">wbJ07@ÖQz)Dְ&[Q{W5Bxd8|wm9Q\<(mKZ9 ZBd A?ayX$T^hgغɋI+R;H.CkZ$sb -ǸTPYB59izMr*\8eIslabTJmjǐW@Y9TU[v \Wl7uOͧqS[ gj{8_!b]]&@_.A~ыWlg߯zHII=!Eل"G=OUWWUWWD@R% W#D ೊ" @qMW5fOo]V1P.~ָ3v[AkHcT][x{l"pi@wCZzr\["a墲_xn3;""7"[ l[e5k{[ 6lzjX#8P vmwf/+TL]|=1j`zn/Tu2_cw ELI']GkvAŠFtv"O$-jnCHW.dJHfJ&\([_ jD;h-eyٛn'Z݆\D{ɔ8h Z ϴ'UfwtL%I$Eo3;jt Etm w7R<.]'Der醸 uCB X:E +[Vm װ]pKrm3t(\'oc'|+{\&Ȥ[ǽW/ȥeb858YЭK"M0;{N)oN &ɕ;kVW4$z4 M{Né^Z.gg,c$t'} {v,ʍo Co:ng31=)>;wy5"lLu.w ~is/6H) S-M'*X9RhEԅ^Fh)dV͍b鐗8e5#_ytS8נY;S~QY):4 -p%7ܵr5ce&NYJ8Ф-ٲȼ``1\0 U$zUJ43TTOnc5k1OQR*3d  T[ɽIAdɸ^gY)͡Pz֎ &o !0$ig)dipSz1Vc#%˒dr6j>jFeid\Rh}彁h/ -Z4# $T@dt#PIG2e2EKVRZ: 'aD")O & g ֆ:f&#`,E'CT?i-JnE^\z6ӣx35E>n'_}l:M̘dK Uj<jovk~ys0ʧWד?ޜ^wJwg\r?)=L+ Bљeqh(* 66P ˟n..h2UF̤]TW~hnN~tA\dp12yRH!y^de֤Q8 #1$#hդ9vZy'ɨבLHY22='ʜg"DƓԆi'n}%'[j S)j8T3`N?f`Բg(-)ꐌA+lѨuo欉UI;KQ;O9RJ29 AgRdߤezP kVarL-WIeA.rb$G&XRI6[^ Sj#! 
פG% 5`@ab$A@H<̩m"mZ` dуj\O:$aT*0ȕO O#I¡'B{##ETk\NT c1Hy5*MddP@"&?d <`66[.H\<$X$$g>ն(XM:U# w̒F dR&E"̘ LEnsRig<9dJi54'@Ad \Ҽ-{^>:*dZⱮRtW1S?^^U:eF \*Ս/=mŢyoZA+o]8O  -7]QOr.A@Om.ssv=Jwf/ChЭ9k4O˫oH%kVd٬\,%K6A_={5j^oS)fLS̹%/bMc"]-$2{vZ@[f) bdV–FkG&#\Z]2ܮրZ.}zEJy59W‘G"F;f4ߤǫMKi#p "M kb FIVpAk$$Pp $$}9$sDv8ImI ǣ-W2p{xUcƊCS*!+ύ@\n&X knt|Ek䬎NHnVCnϙ,.#@rB|6!V^IAB"XJOS}H5{KYJ*` #0_> d%]Nrz%4PuRJF9rRNjT-X a_Vv:ϛI[?&?6>,]HEacjUz\oZ /~EIt]]^Njҋ vZ I״ʅޅy/o&o&)bCv y A% 64O#xU/͈\n.-fŴ/cbV KV8;^.J޻-=s#fC٫P_i^6|a*8aSY1Z_۾VJA r&[Mã99բ rg"O'4CkS$<`K!R1䅣ZYN@%nėxK/(Tc욱z#hx}I:h3+; "*!ƕbRk>Xk6D]T@Ѧ4LF3 @i.Q&fY3U Wk3٠,&ڜր.(L.h؄:+ҏHSRr8 \Kާ"6ż^MG>"̓3--p`hS}tQ/D.ҞjR*30u*t2@&> JPنa]o FF.|Do456'QZL1rd!uFM&[X)C9κ|)i)Q}Tt|;h]9(A=Ԃ+ύih0DF'C۽#غ8 ͇9Flˢ4B{H <8'|&8'G^g M`yGqW0!b 5W.b-2ALS1e/8ְ~1|z{_z>og`K1k\(i%__+1$:]h\5uTf}Fp _tcKp 4P a<,|G#OS}H5|4*oJ%;x9:`2E1%XLdOB$˘gFE4DnBVFl@]cV΂_%YvZ)TbҐ5z]E|rYȬ8I.^;"ז$EÊ hRN4 \,]U)x5 a'vvj9a^NJiX9w,w흽VĠq7=˛r;6r-ZV5W=fO.~91G= 0Ko/\44)g(!'$ɿN&\$F̡)ml h/>GwN$zk~lg=eɲ$L3>dž{RihT1RoJ10xt0}FO^1]ĕgdK?AW(?ހ-ۛ+Z-@Yf1u[+uK\in"_~; m^؉6ӟgjt~׷kK9fգ*34s8svtwh<~X=3LmǓDśY+Z8[2'ܭ~GU'0czAgˬ?W2w_K&X.?;̯P^#9gE'Xo`9wT?7K!77= 5Br`R9Bttms}M [Q{7]9Nؕ_X!;R 65ڗħL!#p2)Ŀ5\˖zR-* pTfn)4n_>ȳJY%y&r;(2jyKgUqE2X%W|՝ShP З}wnk]Fu|pzM}ש@yvY1pFϻ;<قDDR5 K\a0x1d { ^YR8/s+Ha,\uǔkWW2[KY!;jwjHJTqM e.sS3иZ67fv- :s򻕗,(Hhrc$co߹WN:zKNS/3vfm:}4}CJB1dP 4F (AQrD%TEzBwfXӱy*?֖}4f2?78e5=K vpͺu>pVrYZN #{<kOvygw+.V_'3Bm4  %eT $<=ܽ8t*D ;ȧlS")C6޴C,E궪EpkT)-$jX۱ʮ]DYVJz!"_VH~[B_&4MFP:|~:R+ژ׿^"J׃nh7%YoӒrx 9dr,|^xC*֛ 'k$vNTi'7iYi%x8N '̭%{_$Ӂt mpAe0Yegg@;V'M}{3ߖ9}f?gL oga~n8T#O:8^I(š{έ3#3uoO Nj؎5%=g!i:äNfo,@ +Mpuk]tPCr z7ݏB;h;?wQM5&F% ( G eőrR~TA8U~H7WlD3r&ԏ":P߿Fl;LБsyn} @=7Z({JGdQ=-(awf=srrr͑'b$$nQ%@ AmѢVBEGT hPЗ]VbHJYuܭȃ`+dc h%@di &`!KI@Rw JۻiJ{+meRߐtye ְ݅㥑 zqg~6kk~F{Aw5HxW;=+d9mK-IQYs]9:U";x]5t8 |Z|wtA pw&-)dbZ&N]h2xÈHQ 5$YJ:DR\9PY7WJpݵnl¸X7J6\[ R(D)Rѐ٣ 㑡k< h8ңt=Y{J޺>'M7ko=k'=QBzN'}9/F|e0ؿ߄駍h龥OwMs^ho},C>sѥV|b /4/ *VYhZ.u7wnp(>61W1_A`k1qv2UJ<ކY+ӪnCGaރKU(|s>@>}'r$Q$>tkv}zASnsiYE +_|4d~4e~B[ɟz˻EU`m]_5]/5W{z*H+va:쿯Q'qXxz p9AB 
C\8}%?bh9.kv[][g_YO:ԦN瑱< ~nNOjPxJH Ε֟>:_(zQTQsr@s5zE0[m|;Fuo ~;ue|QjcGX\@4Y6WVnze,ڶròrLvZi۟;}=D?g` )i=~ppvN-qDY[xtˉč'-d7MkmnsͶTjg?o=*n-ik ߃נ:?_<5f:_Yk(64l]]Ӝnj^jL%emr}6YtO9 go=n(?6t wn'4w]^VCKgѽy F=v5tA٣ S6Kr%(a_&{Ҙ]whx X#ZUتJw'R ײ6WvE4ʆZF||:'kvSY5YAGbYSI Sz}2ϯk-u]Blu]C%өb0D `Ȅ49=GuhdFc9dr" 9.LϒLiGwM@4NrrNftb^ʼnuxUR7;K\"QƌO,d1$xtSc| /2i`&@!dcZ4ߗ73H!bԄYpZ{%*P ~7C )'IGX\KD0D hVBidSm\\/U&j%ؗOδڗk/ei*2>9m1""!0A r E~̆TKn/ IZծ;0O–XWfp(|4(8F:FיA0 !i%Vz5PV81j&ろj^[^3eI)O t'b^.wOnUsy8R5pj07X1B7^M|k h;BQkc7ϜIXF"$V˧O ƃBY8xby Kw%:Y\(Ҹ/L(/2HZwC*:=Z-M$>9 H P& #2ώV+@!]ҳ, 7r{Q'V|C=M#B,diXM2b Uib=f}*I ltV-⌋(DrgK^:heJ<>gj1$O m0A'sE)\ ]RO{z*B-0W3vj5YNc G旫Fswe[mLTIsۀ@&OW'wg '` C7>Tu^cdr37 :; CᶖA*WӚ"/c /~{G#D˺]&T˻jEbJ+J3dU])j[("Exx77:4t v.3ızhQOb!W VE^}zULh r FUKXW-:x2,΄{P!d"O^UK&W}8!F2m8 ?xW˹s`s CR 6eXZ҉4P:vH5vkT)TBzqja" kaJ B^:9 ܴwiG4M;iIgLsmQ\R[cNrĮ`>1<}{Z kDzV?RvQW޵Y ݊pȾҘmLԐkMsto#v&e:Lc^4j-oꀐjW)d{TrVLb"hξl.o+@qznGKi&* {FVd#ڝ40 ڑ/ih_7T6F?zՀ)W XWI$(*!)! YR,u}JrjrK2*JA=XWGOmkL􏥹wHHEt-7|bo GZ; aͅe_l+vԌjޕYX-< C΂*Cl8UXz6/Gde,rIy ;SvZ[nkI7Ei͟.gNR6a4Ng9OZf,%̗^m4VsagKj%lW iuEߜ?2LcsuZWh @SKP7KF}#—jmVZK$R&]2U|ެv ]Q"~i3tss+/N1P? I OF`4M|a[U D#<<6u0izh tvNMgFkc'vͿ7\[&0{>vxKrQ?*iAdPݭ2NFQ/T z7mQ: ,* bo@%[FM6m.^mN1cP P6UBz4`.t"&E_tN&8g5l~aO `nca'd'ӦbSF~(2NKX ;><9xFg3gXg༈>qXBT`Nؾ0a^".|i^=SoTtN#AUF zdn?)PIHQz=jA_)}R{Rglev?M7ÓPL&'gݶ sf¡nnSm^ڒI%ZNH1+H1o}Xl}>(ʉyEW+\ɭZFY~Y)IQ~mC4(\<\u:Gܻ@֫ކ^cqs{ר?[5Zg?zJ{g=s9QXV˂ˡY1ҁYXsʢRu] 50-bx#\)xusAj B*yU6h_g?yٱУ4nϡ߷|~}}er,`&L,/\sHTDt?_oClDכ,Ӧ+9W&脩ׯ9rd87!?GANKb%./Pt5޹&:Y8ҐNݒ-(2{={ (I4=OG ~+Pǀ)ΟUp0:jNwDp_rTxyFJ%`UM`SkE'7. _p0᧺X/x_xPg ^q`JZU-7yPkC@:)-׃.b72ӢrZY[uq(ȹ܄S$K)_/6[9s-ѩcC 8 NWF Wfc c8bn[ G8P_] aUk$ϕ)HQКkV[ބ {0$γZ޴Ű!W*Z@o+ P9ӝeULN˾mJ E4NW@jhַ G*yrPL`.pVn<%b}4. 
H)./6(թ@""` %di;J _|'ch HkEfoXG?~v"93)2Ir|t!a& [L3b%cbQK11ҭO>?=x;qbhmh*?ߴV!Xn_ǝӼW?#ٮgHNj$o}IB2-z>:$wtv_!il BawM)$U]QI?Wor6cIE" V[}/W4Jw!欋JslMmU/٫g˗"5";bc`)/9u7;`7;^ڡ^5{{ZBL0A|V8S%lr><ܼyoP^3?)}3`Y \q/7p<^;JV߲ެ9LP)pK{&mom?ĶͦyhgM{ٷ흳q0b-*r NcwPt4+{_h a-I\j%|׬NV6{XZG Uvm,gKgr=AY`g796;եqWiwLmv;)/~h4Od<93!,Z&b)(旋mDDn*B)IDR>@h22x1q҅yQ)f'65 DZh9Yt]D=UidwɦkL>Y^G$ZP$#m3^j"XiIA J[µ0Rq$D4V(00 {|u-eJVB>s Am #]!$9*K\+ܧ"6.B֏k+c4a5gq1eB^w`[T/0UjQR-ebQ$ 8f KgDjd ةQǏ Tbb%b 4(urBh܁xpK97*,O"L[Lpe, 38ɕO=CeL"RTL\q tp#B!@(ڴ/wDjB!q< (LM}]0l9 K䨙J](gE_xǚl@(A`CL `G)aC[s3}:sXm[Y،X21=zj8TjCbH#hHDu gD NQ"aopGσqOY  f2z%13_#9*؂+ggY+u@Td,aę4X('i@`,smV\U{,@m0!#kYR+uPҘqnRߓ_|x(.Px4~6*БUQVٓm6"Z@âVH/b)@_Z6ev!թ% x]YZ&iS/W\+Rct:qCRGJL, 6U(Lƫ1ts¹nl&稢B,5g)#8BYWq5,+B/=̩Lsek2Zsy2˜RͭLK8k9A" ~I$ks;,b%OlcOS"Ț I3mńjXtl Nݪ6-)]?i NL| .6G,AJRi߂X `sv R}-B,8+ZQP|˚`9FcQ}yAjo lAų4!]L:TEXi-NEX:7} ƍVkc8QJ8E 3.cڃpIyVC`Ɣ6UuS<ҥY1 x,>%0}wg kN^+NqW"bxLh ]k/^œbMDy G`'Alry ԨQ\Е)L*ޑ$$ mOFI:1{:L)}_94q9r+L2Ō|xz$H)Y3$,"8!L? ! ]Aa?V"h#8ZN?g>6o\'MmL"Ê!b I?^ :_d+DvFT_:OpH ZZ( u \CRk'a^yoչ7n*UZn*UyJsw]?T]ǫm-K `ۯd]^`ιy ܆+*l+s.`3Rp]o'+vA5Pqkrhxǝܭ1/OLm}-!/VJD ض!/ MGl ,eOF+oڬ>x5%f6t ȵ=O5ȿT mdHЧ\;8peԪUW0柪D:57()B^D*Y'T,$9:13"=+`5%?f΋(;?C﬚w1x9]|^,kAPjqW46F~t?I"& =^:_Xwu,ꗻ/`O!=6{?+5$!_ׅ֒o+3tCŠcCKbnڭ=ڭ EL)}v#l>[] N>nECV?JEք|"dbLo,}9gQzd4_?ŋItmy)|oN]p{xZ2]#4fzodFΕ].]@i4L?{s9}Hp{{yEIySh GɃ3R͜d0a$s>Mr[ s_#.emi@ESlf U]ڥaX?˟QiyҟqM_FjpձqBoxo:f_N*S sV 8௧Чvzhf8+Ζ}bu vnMAlw Ak uRU}d-NSJ1;dJ"6amZLxLi^*QXU.+F2+D8n$q{0ۘOfկQ.ҤAoj{H_%9 CWwWI. .,֮,iIk;OPPyᐲ]@=쩮z* x)JG.(AIV%N6jJ^ݟm*k*;ijneKD5Db*OfLl %(t|i3M{Mw9X}., DM/|Tpc/˞~ 0s/l/`ntdTIRF܁JCS^煟8х)$(pTbdnÕZ DP$gVGPE5¨Àѳ`w3rYו{ Us/&VHNM* @w9BXZg:@n m`+$("5vXcWR_o~wJ@W }v@JF&t]|-R}Cg =$'F =hDR{(@sYɺMCnqj,gjbZ?&ݫM~ vv qcrc'B}2^TZ32yGd'r9Wc,mA#VRYe"Rн[2Bko+&G #Uծ *OD6/ay&O[QBM{<؃ҕmCBe{&AB:'Ted'8 `9O,9ȺSާ#oG.ϐK~lƞ:#PcGX>_i\MBcGiptb1fɅN($)vN.&^_+K[ [?s_>ÙNu-j8$Q athIOGeTd3bGBUizFCvaF-^DMܛ5m-ַ29Ź]6g kZ|V|;b҅8G SPhe]GFhbR*T(,$e &g/G4JFhhp~n6{nQoM+S(l^$]^ive XBp4 ^)XF@8WhO QYf! 
,p P ޟji=t)H((+ N'Z%BFbЊjˢ2L&fnQMj` ?׍jko$,jo?%/:'/0MgUSWɣXg@&ƓO8:məZ5{j1ay|2y wo*g#U/ƌ)M3C|Xh^m쫻X;ۮٻ*K='_Zp[YhƋbڅBBE0BKx;˭#?װxYɒdL!-YBY! T5$袩_a,yU(4Y3DFyB؅rk68r%淎6KTNV&+D9gh}ix?N#7i2D~?m#atƘ$t"y3'o(h᲎%CRjm,}*#2shY7X 'Lk&W%gW@,a|O5eYuZ0ۮa`|}Vhʹʳ$t&Ș<#+N,@qi&g8Vf#DP7[m4!K!Bd\ќYÀFHFyܶ]C̴PBA% H2꒙C2v#K2'l7PJr]#pE3v)x:i7ePYc%K%xpe.5foB[!B6` .F1.F'z_p1B99}#طq(f,N#0݇mbMJ_'C;Qx; QrOt Qf s/މ^I9àO[ʊÇr7tK0/t'\3Adx8Ҕ5rE5>K69NQzu,gx}(NTux~lrP77"~1TS=ZzO7i'1ݢ=]t8xzzrUn'c}"H;#evznr)n0YBTA=bR7h*3Gٓ;,Gc8S >wuc {(9ԑ W|p3߃6aЍz6OĽ@$ =<8,ޟjH+\vih8"%|7ٯT2u*>]=_=4|_rXe՟powpW}vӧkк Zmm9NP0A.,ꏒKXU5`v?<,9ii[VڳR|ˆ(WA4WA:w]&;6%3] %4u}YhА\E³8w[:6EwtNx`:m0I6*S`=ΐSHv`%LK0%#V°N*Π\sD!gCZJtڭ @cACҵ`WByeMSFKvavW&fK۝e{q,`P8i'I!{N}+C2ށvhRŒ`Bى[sZb| CѪz^`#DJ[5 kT;fKޟ9Qq|w%`0nmNJp.snUNҫ\. ý?~fXR+jOOTjm~?<QG!OKMQȉjC;zeq`Es16"|h/aϡ䙐zlkTʹdh*m3U#Q0B"?7㚃 #IF`/pKoQ'K+\L-kN3T fޏjF"W"`u]tG h`cDʁ Jh5 mE!b EZu{!7M@73u}j6g(Be*: )al'loA<Nu/ےz؟U(G z{5ɓ[OC81Ы.9| 3FuYp2TOE,S~$ӭt0٬ڡAr@}} -"Q6| h8HƜl3$_Ra[J=S(O=|fs8V!FZz ܪu '|^ljk:~s\ $ eaLҴL:@ibJ /;c]ԡ ;RDd"BP$6@-JtbٮXWah;qfru %U\~ q$d,>FlW6]<4m$^u0Y:0wby/(5Y`j}9\a26NzH 2mKvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000005036234515137054144017711 0ustar rootrootJan 30 05:07:40 crc systemd[1]: Starting Kubernetes Kubelet... 
Jan 30 05:07:40 crc restorecon[4763]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c476,c820 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc 
restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc 
restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 
05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc 
restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc 
restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 
crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:40 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:40 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30
05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 05:07:41 crc restorecon[4763]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc 
restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:41 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc 
restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc 
restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 
crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc 
restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc 
restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc 
restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc 
restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 05:07:42 crc restorecon[4763]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 30 05:07:44 crc kubenswrapper[4841]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 05:07:44 crc kubenswrapper[4841]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 30 05:07:44 crc kubenswrapper[4841]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 05:07:44 crc kubenswrapper[4841]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 30 05:07:44 crc kubenswrapper[4841]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 30 05:07:44 crc kubenswrapper[4841]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.023166 4841 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028304 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028337 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028348 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028357 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028366 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028376 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028386 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028394 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028439 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028448 4841 feature_gate.go:330] unrecognized 
feature gate: VSphereMultiNetworks Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028471 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028481 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028489 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028498 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028506 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028515 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028522 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028530 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028537 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028545 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028553 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028561 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028568 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028576 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 
05:07:44.028583 4841 feature_gate.go:330] unrecognized feature gate: Example Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028591 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028599 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028606 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028614 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028621 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028629 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028637 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028645 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028653 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028661 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028669 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028677 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028685 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028693 4841 feature_gate.go:330] unrecognized feature gate: 
OVNObservability Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028701 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028708 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028716 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028724 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028734 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028745 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028754 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.028832 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029689 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029701 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029717 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029727 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029737 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029745 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029754 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029763 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029771 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029779 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029788 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.029797 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030222 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030234 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030242 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030253 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030264 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030276 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030286 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030294 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030303 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030311 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030319 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.030334 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.031999 4841 flags.go:64] FLAG: --address="0.0.0.0" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032166 4841 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032717 4841 flags.go:64] FLAG: --anonymous-auth="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032778 4841 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032799 4841 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032812 4841 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032831 4841 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 30 05:07:44 crc 
kubenswrapper[4841]: I0130 05:07:44.032847 4841 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032861 4841 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032873 4841 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032887 4841 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032900 4841 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032911 4841 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032924 4841 flags.go:64] FLAG: --cgroup-root="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032935 4841 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032948 4841 flags.go:64] FLAG: --client-ca-file="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032959 4841 flags.go:64] FLAG: --cloud-config="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032970 4841 flags.go:64] FLAG: --cloud-provider="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.032982 4841 flags.go:64] FLAG: --cluster-dns="[]" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033004 4841 flags.go:64] FLAG: --cluster-domain="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033016 4841 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033028 4841 flags.go:64] FLAG: --config-dir="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033039 4841 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033052 4841 flags.go:64] FLAG: 
--container-log-max-files="5" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033068 4841 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033081 4841 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033094 4841 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033106 4841 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033118 4841 flags.go:64] FLAG: --contention-profiling="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033130 4841 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033142 4841 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033155 4841 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033167 4841 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033185 4841 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033198 4841 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033209 4841 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033222 4841 flags.go:64] FLAG: --enable-load-reader="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033237 4841 flags.go:64] FLAG: --enable-server="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033249 4841 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033268 4841 flags.go:64] FLAG: --event-burst="100" Jan 30 05:07:44 crc kubenswrapper[4841]: 
I0130 05:07:44.033280 4841 flags.go:64] FLAG: --event-qps="50" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033293 4841 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033305 4841 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033316 4841 flags.go:64] FLAG: --eviction-hard="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033331 4841 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033343 4841 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033355 4841 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033367 4841 flags.go:64] FLAG: --eviction-soft="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033379 4841 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033390 4841 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033448 4841 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033463 4841 flags.go:64] FLAG: --experimental-mounter-path="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033475 4841 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033486 4841 flags.go:64] FLAG: --fail-swap-on="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033497 4841 flags.go:64] FLAG: --feature-gates="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033512 4841 flags.go:64] FLAG: --file-check-frequency="20s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033524 4841 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 30 05:07:44 crc 
kubenswrapper[4841]: I0130 05:07:44.033537 4841 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033549 4841 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033563 4841 flags.go:64] FLAG: --healthz-port="10248" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033576 4841 flags.go:64] FLAG: --help="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033587 4841 flags.go:64] FLAG: --hostname-override="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033600 4841 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033612 4841 flags.go:64] FLAG: --http-check-frequency="20s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033626 4841 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033638 4841 flags.go:64] FLAG: --image-credential-provider-config="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033651 4841 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033664 4841 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033675 4841 flags.go:64] FLAG: --image-service-endpoint="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033687 4841 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033700 4841 flags.go:64] FLAG: --kube-api-burst="100" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.033712 4841 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039047 4841 flags.go:64] FLAG: --kube-api-qps="50" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039065 4841 flags.go:64] FLAG: --kube-reserved="" Jan 30 05:07:44 crc 
kubenswrapper[4841]: I0130 05:07:44.039079 4841 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039092 4841 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039105 4841 flags.go:64] FLAG: --kubelet-cgroups="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039116 4841 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039130 4841 flags.go:64] FLAG: --lock-file="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039142 4841 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039154 4841 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039167 4841 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039189 4841 flags.go:64] FLAG: --log-json-split-stream="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039201 4841 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039213 4841 flags.go:64] FLAG: --log-text-split-stream="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039225 4841 flags.go:64] FLAG: --logging-format="text" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039237 4841 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039249 4841 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039262 4841 flags.go:64] FLAG: --manifest-url="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039274 4841 flags.go:64] FLAG: --manifest-url-header="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039294 4841 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 30 
05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039306 4841 flags.go:64] FLAG: --max-open-files="1000000" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039321 4841 flags.go:64] FLAG: --max-pods="110" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039333 4841 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039344 4841 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039355 4841 flags.go:64] FLAG: --memory-manager-policy="None" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039366 4841 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039378 4841 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039391 4841 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039437 4841 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039479 4841 flags.go:64] FLAG: --node-status-max-images="50" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039492 4841 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039505 4841 flags.go:64] FLAG: --oom-score-adj="-999" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039518 4841 flags.go:64] FLAG: --pod-cidr="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039530 4841 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039548 4841 flags.go:64] FLAG: --pod-manifest-path="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 
05:07:44.039560 4841 flags.go:64] FLAG: --pod-max-pids="-1" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039572 4841 flags.go:64] FLAG: --pods-per-core="0" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039586 4841 flags.go:64] FLAG: --port="10250" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039598 4841 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039610 4841 flags.go:64] FLAG: --provider-id="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039621 4841 flags.go:64] FLAG: --qos-reserved="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039633 4841 flags.go:64] FLAG: --read-only-port="10255" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039647 4841 flags.go:64] FLAG: --register-node="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039658 4841 flags.go:64] FLAG: --register-schedulable="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039670 4841 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039692 4841 flags.go:64] FLAG: --registry-burst="10" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039703 4841 flags.go:64] FLAG: --registry-qps="5" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039715 4841 flags.go:64] FLAG: --reserved-cpus="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039727 4841 flags.go:64] FLAG: --reserved-memory="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039741 4841 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039753 4841 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039765 4841 flags.go:64] FLAG: --rotate-certificates="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039777 4841 flags.go:64] FLAG: --rotate-server-certificates="false" 
Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039788 4841 flags.go:64] FLAG: --runonce="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039799 4841 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039812 4841 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039824 4841 flags.go:64] FLAG: --seccomp-default="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039836 4841 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039850 4841 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039862 4841 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039874 4841 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039886 4841 flags.go:64] FLAG: --storage-driver-password="root" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039898 4841 flags.go:64] FLAG: --storage-driver-secure="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039910 4841 flags.go:64] FLAG: --storage-driver-table="stats" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039922 4841 flags.go:64] FLAG: --storage-driver-user="root" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039935 4841 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039948 4841 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039960 4841 flags.go:64] FLAG: --system-cgroups="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039971 4841 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.039993 
4841 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040004 4841 flags.go:64] FLAG: --tls-cert-file="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040016 4841 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040035 4841 flags.go:64] FLAG: --tls-min-version="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040047 4841 flags.go:64] FLAG: --tls-private-key-file="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040059 4841 flags.go:64] FLAG: --topology-manager-policy="none" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040070 4841 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040082 4841 flags.go:64] FLAG: --topology-manager-scope="container" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040095 4841 flags.go:64] FLAG: --v="2" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040113 4841 flags.go:64] FLAG: --version="false" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040129 4841 flags.go:64] FLAG: --vmodule="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040143 4841 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.040156 4841 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040584 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040604 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040618 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040630 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 05:07:44 crc 
kubenswrapper[4841]: W0130 05:07:44.040641 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040653 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040665 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040677 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040688 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040698 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040708 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040718 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040734 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040745 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040755 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040765 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040775 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040786 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040796 4841 feature_gate.go:330] unrecognized 
feature gate: MultiArchInstallGCP Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040805 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040815 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040825 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040834 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040846 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040860 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040874 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040886 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040897 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040907 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040920 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040930 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040941 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040951 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 
30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040962 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040973 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040984 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.040994 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041004 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041014 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041025 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041036 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041049 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041062 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041074 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041087 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041098 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041109 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041121 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041134 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041145 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041155 4841 feature_gate.go:330] unrecognized feature gate: Example Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041165 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041175 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041185 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041195 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041209 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041220 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041234 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041247 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041259 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041271 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041282 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041292 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041302 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041313 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041324 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041333 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041343 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041354 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041364 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.041376 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.042178 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.057701 4841 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.057747 4841 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057884 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057899 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057909 4841 feature_gate.go:330] unrecognized feature gate: Example Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057919 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057930 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057940 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057949 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057958 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057966 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057973 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057982 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057989 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.057997 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058005 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058013 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058020 4841 feature_gate.go:330] unrecognized 
feature gate: NodeDisruptionPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058029 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058036 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058044 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058052 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058060 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058069 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058076 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058084 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058094 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058104 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058114 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058124 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058132 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058140 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058149 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058158 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058166 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058174 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058184 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058193 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058202 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058211 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058219 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058227 4841 
feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058235 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058243 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058251 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058260 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058267 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058275 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058284 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058292 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058299 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058307 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058315 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058322 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058330 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058338 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 
30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058349 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058358 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058369 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058381 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058390 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058434 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058443 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058452 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058487 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058497 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058506 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058514 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058522 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058529 4841 feature_gate.go:330] unrecognized 
feature gate: SigstoreImageVerification Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058537 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058545 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058554 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.058566 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058821 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058832 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058841 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058849 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058857 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058866 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058875 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 05:07:44 crc kubenswrapper[4841]: 
W0130 05:07:44.058883 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058891 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058900 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058907 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058916 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058924 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058932 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058940 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058948 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058956 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058964 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058974 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058982 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058990 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.058999 4841 feature_gate.go:330] unrecognized feature 
gate: MetricsCollectionProfiles Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059007 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059014 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059022 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059030 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059040 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059049 4841 feature_gate.go:330] unrecognized feature gate: Example Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059057 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059064 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059073 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059083 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059093 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059102 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059125 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059135 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059145 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059155 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059163 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059171 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059179 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059188 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059196 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059204 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059214 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059224 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059234 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059244 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059253 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059261 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059269 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059280 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059289 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059298 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059307 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059315 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059323 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059331 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059339 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059347 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059356 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059363 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059372 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059380 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059389 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059422 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059430 4841 
feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059438 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059446 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059455 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.059475 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.059487 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.067215 4841 server.go:940] "Client rotation is on, will bootstrap in background" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.073146 4841 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.073299 4841 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.076576 4841 server.go:997] "Starting client certificate rotation" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.076616 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.078649 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-08 05:02:51.759339792 +0000 UTC Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.078777 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.133345 4841 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.138713 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.139717 4841 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.173534 4841 log.go:25] "Validated CRI v1 runtime API" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.246886 4841 log.go:25] "Validated CRI v1 image API" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.249272 4841 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.254222 4841 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-30-05-02-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.254273 4841 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.288666 4841 manager.go:217] Machine: {Timestamp:2026-01-30 05:07:44.286090912 +0000 UTC m=+1.279563640 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c BootID:98d8a8ed-cf63-480b-98f6-6728ad28fc06 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 
Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:fe:d3:ee Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:fe:d3:ee Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e7:7e:65 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2b:f2:d2 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d6:04:e1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:85:26:02 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:7b:91:3f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:85:ce:04:83:5a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:8d:92:dc:1a:7d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.289093 
4841 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.289303 4841 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.289909 4841 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.290299 4841 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.290372 4841 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"no
defs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.290819 4841 topology_manager.go:138] "Creating topology manager with none policy" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.290840 4841 container_manager_linux.go:303] "Creating device plugin manager" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.305359 4841 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.305461 4841 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.305821 4841 state_mem.go:36] "Initialized new in-memory state store" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.306011 4841 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.320750 4841 kubelet.go:418] "Attempting to sync node with API server" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.320806 4841 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.320846 4841 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.320877 4841 kubelet.go:324] "Adding apiserver pod source" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.321331 4841 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.327296 4841 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.328616 4841 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.329840 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.329922 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.330001 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.330034 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.333294 
4841 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335293 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335341 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335360 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335376 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335429 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335444 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335459 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335482 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335501 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335519 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335542 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.335558 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.337557 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 30 05:07:44 crc kubenswrapper[4841]: 
I0130 05:07:44.338492 4841 server.go:1280] "Started kubelet" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.338666 4841 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 05:07:44 crc systemd[1]: Started Kubernetes Kubelet. Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.342300 4841 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.342632 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.343182 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.343243 4841 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.343253 4841 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.343520 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.343667 4841 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.343654 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 19:05:36.943729935 +0000 UTC Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.343699 4841 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.343903 4841 desired_state_of_world_populator.go:146] "Desired state 
populator starts to run" Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.350004 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.350157 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.350393 4841 factory.go:55] Registering systemd factory Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.350488 4841 factory.go:221] Registration of the systemd container factory successfully Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.351339 4841 factory.go:153] Registering CRI-O factory Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.351374 4841 server.go:460] "Adding debug handlers to kubelet server" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.353787 4841 factory.go:221] Registration of the crio container factory successfully Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.353448 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="200ms" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.358256 4841 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or 
directory Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.358328 4841 factory.go:103] Registering Raw factory Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.358370 4841 manager.go:1196] Started watching for new ooms in manager Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.359377 4841 manager.go:319] Starting recovery of all containers Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.364364 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.36:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f69f174426910 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:07:44.338438416 +0000 UTC m=+1.331911094,LastTimestamp:2026-01-30 05:07:44.338438416 +0000 UTC m=+1.331911094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369130 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369210 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369236 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369285 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369308 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369327 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369346 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369365 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369392 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369461 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369483 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369504 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369524 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369548 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369572 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369592 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369611 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369632 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369650 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369669 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369687 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369708 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369727 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369746 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369767 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369786 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369812 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369834 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369882 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369905 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369925 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369944 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369963 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" 
seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.369982 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370001 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370020 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370040 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370059 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370077 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 
05:07:44.370097 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370118 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370137 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370156 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370176 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370195 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370215 4841 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370234 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370256 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370277 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370302 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370323 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370343 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370369 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370390 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370507 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370533 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370568 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370587 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370606 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370667 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370687 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370705 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370724 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370742 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" 
seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370763 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370782 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370800 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370818 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370838 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370855 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: 
I0130 05:07:44.370873 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370892 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370915 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370933 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370950 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370969 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.370987 4841 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371005 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371024 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371042 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371097 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371119 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371138 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371157 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371174 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371193 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371210 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371228 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371249 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371297 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371315 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371332 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371350 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371368 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371387 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" 
seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371443 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371466 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371484 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371502 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371519 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371538 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 
05:07:44.371556 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371574 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371590 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371618 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371639 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371659 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371679 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371699 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371722 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371741 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371760 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371781 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371806 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371826 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371845 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.371862 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372078 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372104 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372121 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372150 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372168 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372185 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372202 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372219 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372237 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372255 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372272 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372290 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372307 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372323 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372341 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" 
seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372358 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372375 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372393 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372452 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372478 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372496 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 
05:07:44.372514 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372531 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372552 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372569 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372586 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372605 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372623 4841 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372640 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372656 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372673 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372691 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372708 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372727 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372745 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.372777 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382170 4841 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382271 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382309 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382334 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382358 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382383 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382441 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382464 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382485 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382506 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382529 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382552 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382574 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382595 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382649 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382674 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382696 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382721 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382743 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382767 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382790 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382813 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382887 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382911 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382933 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382955 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382976 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.382998 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383021 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383041 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383061 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383081 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383117 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383144 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" 
seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383165 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383186 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383211 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383232 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383256 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383281 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc 
kubenswrapper[4841]: I0130 05:07:44.383301 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383323 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383344 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383363 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383383 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383437 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383460 4841 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383481 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383505 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383528 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383551 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383573 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383598 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383620 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383642 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383662 4841 reconstruct.go:97] "Volume reconstruction finished" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.383676 4841 reconciler.go:26] "Reconciler: start to sync state" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.403640 4841 manager.go:324] Recovery completed Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.420331 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.422942 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.423025 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.423048 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.424212 4841 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 
05:07:44.424240 4841 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.424274 4841 state_mem.go:36] "Initialized new in-memory state store" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.428035 4841 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.430500 4841 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.430570 4841 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.430612 4841 kubelet.go:2335] "Starting kubelet main sync loop" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.430731 4841 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 30 05:07:44 crc kubenswrapper[4841]: W0130 05:07:44.436687 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.437067 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.444614 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.493549 4841 policy_none.go:49] "None policy: Start" Jan 30 05:07:44 
crc kubenswrapper[4841]: I0130 05:07:44.494990 4841 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.495114 4841 state_mem.go:35] "Initializing new in-memory state store" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.531206 4841 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.545013 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.555884 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="400ms" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.601643 4841 manager.go:334] "Starting Device Plugin manager" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.601745 4841 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.601798 4841 server.go:79] "Starting device plugin registration server" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.602557 4841 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.602589 4841 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.602818 4841 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.603013 4841 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 30 05:07:44 crc 
kubenswrapper[4841]: I0130 05:07:44.603041 4841 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.614874 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.703200 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.704928 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.704959 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.704972 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.705001 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.705718 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.731882 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.732038 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.733380 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.733461 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.733482 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.733714 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.733902 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.733976 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.740940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.740987 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.740997 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.741189 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.741552 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.741724 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.742695 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.742750 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.742772 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.742999 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.742695 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.743626 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.743645 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.744013 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.744085 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.745325 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.745363 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.745380 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.745796 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.745840 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.745863 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.746252 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.746588 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.746630 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.746650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: 
I0130 05:07:44.747154 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.747217 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.748107 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.748177 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.748195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.748513 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.748564 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.749483 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.749541 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.749563 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.750437 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.750463 4841 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.750473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788518 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788570 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788742 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788841 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788888 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.788983 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.789041 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.789085 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.789133 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890163 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890213 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890257 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890339 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890358 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890339 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890454 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890501 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890504 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890615 
4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890316 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890698 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890647 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890808 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.890989 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.891043 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.891092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.891114 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.891153 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.891211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.891157 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.891157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.906050 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.907967 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.908205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.908227 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.908268 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.908763 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc" Jan 30 05:07:44 crc kubenswrapper[4841]: E0130 05:07:44.956842 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="800ms" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.993065 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.993146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.993190 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 
05:07:44 crc kubenswrapper[4841]: I0130 05:07:44.993378 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.082354 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.115689 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.126388 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.144834 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4841]: E0130 05:07:45.144977 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.147498 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.155128 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.208757 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4841]: E0130 05:07:45.208890 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.209004 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-851bd747b871e89d82a835260dd2a67efba82b4d47523aabbce725ee24191b38 WatchSource:0}: Error finding container 851bd747b871e89d82a835260dd2a67efba82b4d47523aabbce725ee24191b38: Status 404 returned error can't find the container with id 851bd747b871e89d82a835260dd2a67efba82b4d47523aabbce725ee24191b38 Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.213211 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-71dd44b30c4b89fd04c860260be941f9cc66328f4e5a5aed392564a36ab7ad40 WatchSource:0}: Error finding container 71dd44b30c4b89fd04c860260be941f9cc66328f4e5a5aed392564a36ab7ad40: Status 404 returned error can't find the 
container with id 71dd44b30c4b89fd04c860260be941f9cc66328f4e5a5aed392564a36ab7ad40 Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.220672 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-92ff1bc5b91450ca3e386d754cce847dc71ff527e9cfca9ec3adda07013b782d WatchSource:0}: Error finding container 92ff1bc5b91450ca3e386d754cce847dc71ff527e9cfca9ec3adda07013b782d: Status 404 returned error can't find the container with id 92ff1bc5b91450ca3e386d754cce847dc71ff527e9cfca9ec3adda07013b782d Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.224156 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-60399dc30b8905faa7d90efe5cbd6b7201bc51a51f4de16d7ed8b01415f528fa WatchSource:0}: Error finding container 60399dc30b8905faa7d90efe5cbd6b7201bc51a51f4de16d7ed8b01415f528fa: Status 404 returned error can't find the container with id 60399dc30b8905faa7d90efe5cbd6b7201bc51a51f4de16d7ed8b01415f528fa Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.309164 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.311394 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.311493 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.311519 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.311569 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:45 crc 
kubenswrapper[4841]: E0130 05:07:45.312395 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.343770 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 00:09:16.152204484 +0000 UTC Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.343822 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.405576 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4841]: E0130 05:07:45.405705 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.436642 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"60399dc30b8905faa7d90efe5cbd6b7201bc51a51f4de16d7ed8b01415f528fa"} Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.438362 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"92ff1bc5b91450ca3e386d754cce847dc71ff527e9cfca9ec3adda07013b782d"} Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.440106 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ec617a3a95eb822447a4afffa5af05ca9956ae71ceec2222a85386b53229aa82"} Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.441754 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"71dd44b30c4b89fd04c860260be941f9cc66328f4e5a5aed392564a36ab7ad40"} Jan 30 05:07:45 crc kubenswrapper[4841]: I0130 05:07:45.443594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"851bd747b871e89d82a835260dd2a67efba82b4d47523aabbce725ee24191b38"} Jan 30 05:07:45 crc kubenswrapper[4841]: E0130 05:07:45.758491 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="1.6s" Jan 30 05:07:45 crc kubenswrapper[4841]: W0130 05:07:45.760307 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:45 crc kubenswrapper[4841]: E0130 05:07:45.760385 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.112841 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.114348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.114426 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.114467 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.114503 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:46 crc kubenswrapper[4841]: E0130 05:07:46.115031 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc" Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.214827 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 05:07:46 crc kubenswrapper[4841]: E0130 05:07:46.216181 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.343891 
4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 00:06:10.9013868 +0000 UTC Jan 30 05:07:46 crc kubenswrapper[4841]: I0130 05:07:46.343977 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:46 crc kubenswrapper[4841]: W0130 05:07:46.869775 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:46 crc kubenswrapper[4841]: E0130 05:07:46.869892 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:46 crc kubenswrapper[4841]: W0130 05:07:46.892969 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:46 crc kubenswrapper[4841]: E0130 05:07:46.893069 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 
05:07:47.344144 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.344084 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 03:58:56.806527357 +0000 UTC Jan 30 05:07:47 crc kubenswrapper[4841]: E0130 05:07:47.360127 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="3.2s" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.453147 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351"} Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.453228 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e"} Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.455459 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fda30fa2c06c048a03d0b463096bf9e05eb1152bf49416fdcb40fb792441ce3d" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.455526 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fda30fa2c06c048a03d0b463096bf9e05eb1152bf49416fdcb40fb792441ce3d"} Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.455657 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.457711 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.457763 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.457780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.458092 4841 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bcbe9e91dc1fd993a94fc5b67568d12e2c5e3c5a705cb3baf183e755d59a5eb2" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.458162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bcbe9e91dc1fd993a94fc5b67568d12e2c5e3c5a705cb3baf183e755d59a5eb2"} Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.458237 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.459578 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.459615 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.459632 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.461190 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.461296 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9"} Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.461302 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.462551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.462603 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.462619 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.463728 4841 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b9ab50cac4f5c12464f2bff09fcbb6a3f368e235061f4b67c4fe2ab49b975438" exitCode=0 Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.463774 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b9ab50cac4f5c12464f2bff09fcbb6a3f368e235061f4b67c4fe2ab49b975438"} Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.463874 4841 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.464478 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.465049 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.465141 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.465160 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.465477 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.465512 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.465530 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4841]: W0130 05:07:47.578070 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:47 crc kubenswrapper[4841]: E0130 05:07:47.578164 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" 
logger="UnhandledError" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.716062 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.717141 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.717183 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.717193 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:47 crc kubenswrapper[4841]: I0130 05:07:47.717222 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:07:47 crc kubenswrapper[4841]: E0130 05:07:47.717720 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc" Jan 30 05:07:47 crc kubenswrapper[4841]: W0130 05:07:47.740674 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:47 crc kubenswrapper[4841]: E0130 05:07:47.740739 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.343979 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.344469 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 21:30:26.417240393 +0000 UTC Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.469748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617"} Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.475367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f6da22759a7d7487af7123d3e2d465ee78da15c6627dc0f0c7cd3a414b5d1775"} Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.480341 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b"} Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.480440 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7"} Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.480514 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.482772 4841 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.482838 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.482866 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.484068 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e5a0f31bdddbeb383b68c0f918b0e6b5b76cbd1cf788b8dd861f67cfa7abc288" exitCode=0 Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.484138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e5a0f31bdddbeb383b68c0f918b0e6b5b76cbd1cf788b8dd861f67cfa7abc288"} Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.484207 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.485695 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.485721 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.485733 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.486731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f8bd71a73208ab2c63de999ec362cd6f9ea71861dd9d3505791af4f8c33d36e3"} Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 
05:07:48.486845 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.488220 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.488259 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:48 crc kubenswrapper[4841]: I0130 05:07:48.488285 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.081700 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.344520 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.344597 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 04:24:22.534602583 +0000 UTC Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.444552 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.493940 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5f2f35d2e276fd96a4f49bbd37c0ba111c7079f110164888076de4f0e9dd0239"} Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.494001 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54"} Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.494017 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d"} Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.494030 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea"} Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.494102 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.495410 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.495443 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.495458 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.496990 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d84921ef0fdefb2dc5428907c3775d4a50cea4deeb09780945fd3bd20476a578"} Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.497043 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"028759e02e76ccdad6b24bb2b911492de4959f3d4d5f88a3b1f545e7844d2019"} Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.497052 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.498045 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.498077 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.498086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.499818 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="19f11271dfe01f89e5e3a2da8e60f104f6feed66f1bf272b67b13c64128e9d44" exitCode=0 Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.499883 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.499909 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"19f11271dfe01f89e5e3a2da8e60f104f6feed66f1bf272b67b13c64128e9d44"} Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.499934 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.500034 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.500791 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.500834 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.500847 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.504947 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.504986 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.505000 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.508151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.508189 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:49 crc kubenswrapper[4841]: I0130 05:07:49.508206 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.225954 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 30 05:07:50 crc kubenswrapper[4841]: E0130 05:07:50.227188 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.344141 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.345115 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 10:20:58.44738907 +0000 UTC
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.504202 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.504235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d16a98982cc2ab29228ed23deafd46766f4753b30b2caf925e57d7624c1b3240"}
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.504295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c39d71cac3e64cd49376902a990d96b760640e0a15e3482c2b7a04dfde07e973"}
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.504342 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.504362 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.504480 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.504538 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505011 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505050 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505062 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505764 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505788 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505785 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505822 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505842 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.505799 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:50 crc kubenswrapper[4841]: E0130 05:07:50.561155 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="6.4s"
Jan 30 05:07:50 crc kubenswrapper[4841]: W0130 05:07:50.617131 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Jan 30 05:07:50 crc kubenswrapper[4841]: E0130 05:07:50.617245 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.696352 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.696670 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body=
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.696736 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.918633 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.921147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.921180 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.921201 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:50 crc kubenswrapper[4841]: I0130 05:07:50.921227 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 30 05:07:50 crc kubenswrapper[4841]: E0130 05:07:50.922517 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.053310 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.345794 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 19:37:38.682775458 +0000 UTC
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.509809 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.512550 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5f2f35d2e276fd96a4f49bbd37c0ba111c7079f110164888076de4f0e9dd0239" exitCode=255
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.512684 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5f2f35d2e276fd96a4f49bbd37c0ba111c7079f110164888076de4f0e9dd0239"}
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.512728 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.513898 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.513944 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.513962 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.514804 4841 scope.go:117] "RemoveContainer" containerID="5f2f35d2e276fd96a4f49bbd37c0ba111c7079f110164888076de4f0e9dd0239"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.519861 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"64a150e9eb78e6196c85225046c2ce17025bfb15bee59d27e576abd90fca5956"}
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.519911 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bb0d699d46bbf35bc85e7aa3209b15e1d142141f3b1faec170658b430a7b8a4e"}
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.519939 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"efa5fb37ea7c8ef51234e89b68c448d16fe8c49ae75ac6ee1e152fda3b8b6cf0"}
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.520079 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.520120 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.520294 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521649 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521700 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521721 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521656 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521765 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521781 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521796 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521813 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.521798 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:51 crc kubenswrapper[4841]: I0130 05:07:51.932326 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.346389 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 20:06:56.958757407 +0000 UTC
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.526156 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.528820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3"}
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.528891 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.529031 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.530464 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.530525 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.530546 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.530551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.530591 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.530610 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:52 crc kubenswrapper[4841]: I0130 05:07:52.727939 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.347121 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 09:30:39.037860748 +0000 UTC
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.531816 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.531904 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.532035 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.533486 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.533546 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.533566 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.533598 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.533633 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:53 crc kubenswrapper[4841]: I0130 05:07:53.533652 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:54 crc kubenswrapper[4841]: I0130 05:07:54.053715 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 30 05:07:54 crc kubenswrapper[4841]: I0130 05:07:54.053790 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:07:54 crc kubenswrapper[4841]: I0130 05:07:54.348350 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:39:51.717079044 +0000 UTC
Jan 30 05:07:54 crc kubenswrapper[4841]: I0130 05:07:54.536119 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:54 crc kubenswrapper[4841]: I0130 05:07:54.537815 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:54 crc kubenswrapper[4841]: I0130 05:07:54.537900 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:54 crc kubenswrapper[4841]: I0130 05:07:54.537919 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:54 crc kubenswrapper[4841]: E0130 05:07:54.615266 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 30 05:07:55 crc kubenswrapper[4841]: I0130 05:07:55.349364 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 22:45:57.826388572 +0000 UTC
Jan 30 05:07:56 crc kubenswrapper[4841]: I0130 05:07:56.350180 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 01:44:17.878776471 +0000 UTC
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.323535 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.325219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.325286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.325310 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.325351 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.351085 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 17:53:24.766901077 +0000 UTC
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.843926 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.844161 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.846038 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.846119 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.846138 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:57 crc kubenswrapper[4841]: I0130 05:07:57.852025 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:07:58 crc kubenswrapper[4841]: I0130 05:07:58.312474 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 30 05:07:58 crc kubenswrapper[4841]: I0130 05:07:58.352198 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 03:28:50.363277981 +0000 UTC
Jan 30 05:07:58 crc kubenswrapper[4841]: I0130 05:07:58.547053 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:58 crc kubenswrapper[4841]: I0130 05:07:58.548724 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:58 crc kubenswrapper[4841]: I0130 05:07:58.548785 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:58 crc kubenswrapper[4841]: I0130 05:07:58.548803 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:07:58 crc kubenswrapper[4841]: I0130 05:07:58.555768 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:07:59 crc kubenswrapper[4841]: I0130 05:07:59.353272 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 13:06:56.341897611 +0000 UTC
Jan 30 05:07:59 crc kubenswrapper[4841]: I0130 05:07:59.549895 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:07:59 crc kubenswrapper[4841]: I0130 05:07:59.551147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:07:59 crc kubenswrapper[4841]: I0130 05:07:59.551261 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:07:59 crc kubenswrapper[4841]: I0130 05:07:59.551369 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:00 crc kubenswrapper[4841]: I0130 05:08:00.354089 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 16:21:47.509026879 +0000 UTC
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.260156 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.260463 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.262077 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.262121 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.262133 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:01 crc kubenswrapper[4841]: W0130 05:08:01.314426 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.314539 4841 trace.go:236] Trace[687338056]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 05:07:51.312) (total time: 10002ms):
Jan 30 05:08:01 crc kubenswrapper[4841]: Trace[687338056]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:08:01.314)
Jan 30 05:08:01 crc kubenswrapper[4841]: Trace[687338056]: [10.002020565s] [10.002020565s] END
Jan 30 05:08:01 crc kubenswrapper[4841]: E0130 05:08:01.314602 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.344660 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.355087 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 06:24:04.572420271 +0000 UTC
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.376631 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Jan 30 05:08:01 crc kubenswrapper[4841]: W0130 05:08:01.461345 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.461457 4841 trace.go:236] Trace[988680542]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 05:07:51.459) (total time: 10001ms):
Jan 30 05:08:01 crc kubenswrapper[4841]: Trace[988680542]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:08:01.461)
Jan 30 05:08:01 crc kubenswrapper[4841]: Trace[988680542]: [10.001508173s] [10.001508173s] END
Jan 30 05:08:01 crc kubenswrapper[4841]: E0130 05:08:01.461487 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.555089 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.556526 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.556744 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.556887 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.577516 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.905516 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.905601 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.915911 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 30 05:08:01 crc kubenswrapper[4841]: I0130 05:08:01.915982 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 30 05:08:02 crc kubenswrapper[4841]: I0130 05:08:02.355226 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 16:38:35.770810286 +0000 UTC
Jan 30 05:08:02 crc kubenswrapper[4841]: I0130 05:08:02.557126 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:08:02 crc kubenswrapper[4841]: I0130 05:08:02.558335 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:02 crc kubenswrapper[4841]: I0130 05:08:02.558432 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:02 crc kubenswrapper[4841]: I0130 05:08:02.558455 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:03 crc kubenswrapper[4841]: I0130 05:08:03.355475 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 21:53:27.1771507 +0000 UTC
Jan 30 05:08:04 crc kubenswrapper[4841]: I0130 05:08:04.054497 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 30 05:08:04 crc kubenswrapper[4841]: I0130 05:08:04.054595 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:08:04 crc kubenswrapper[4841]: I0130 05:08:04.279521 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 30 05:08:04 crc kubenswrapper[4841]: I0130 05:08:04.279678 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 30 05:08:04 crc kubenswrapper[4841]: I0130 05:08:04.356323 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 17:59:02.107904766 +0000 UTC
Jan 30 05:08:04 crc kubenswrapper[4841]: E0130 05:08:04.615471 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.356729 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 03:04:45.031780883 +0000 UTC
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.702666 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.702945 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.703357 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.703479 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.704287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.704321 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.704334 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:05 crc kubenswrapper[4841]: I0130 05:08:05.710984 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.357197 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 06:56:19.413101335 +0000 UTC
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.568951 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.569503 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.569565 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.570386 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.570476 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.570531 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.906368 4841 trace.go:236] Trace[573743355]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 05:07:54.037) (total time: 12868ms):
Jan 30 05:08:06 crc kubenswrapper[4841]: Trace[573743355]: ---"Objects listed" error: 12868ms (05:08:06.906)
Jan 30 05:08:06 crc kubenswrapper[4841]: Trace[573743355]: [12.86859004s] [12.86859004s] END
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.906447 4841 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 30 05:08:06 crc kubenswrapper[4841]: E0130 05:08:06.910249 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.914798 4841 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.915009 4841 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.931382 4841 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.968074 4841 csr.go:261] certificate signing request csr-pkc8v is approved, waiting to be issued
Jan 30 05:08:06 crc kubenswrapper[4841]: I0130 05:08:06.985021 4841 csr.go:257] certificate signing request csr-pkc8v is issued
Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.358337 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:47:25.454928687 +0000 UTC
Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.375025 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52044->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.375072 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52044->192.168.126.11:17697: read: connection reset by peer"
Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.572503 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.572981 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 30
05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.574808 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3" exitCode=255 Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.574843 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3"} Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.574874 4841 scope.go:117] "RemoveContainer" containerID="5f2f35d2e276fd96a4f49bbd37c0ba111c7079f110164888076de4f0e9dd0239" Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.574991 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.575655 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.575686 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.575696 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.576212 4841 scope.go:117] "RemoveContainer" containerID="042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3" Jan 30 05:08:07 crc kubenswrapper[4841]: E0130 05:08:07.576345 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.987023 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-30 05:03:06 +0000 UTC, rotation deadline is 2026-10-26 13:02:39.102590912 +0000 UTC Jan 30 05:08:07 crc kubenswrapper[4841]: I0130 05:08:07.987063 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6463h54m31.115530286s for next certificate rotation Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.273017 4841 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.333087 4841 apiserver.go:52] "Watching apiserver" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.346899 4841 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.347276 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-2hhs7","openshift-multus/multus-additional-cni-plugins-2ww56","openshift-multus/multus-c49cw","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-machine-config-operator/machine-config-daemon-hd8v2","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-node-4fl5g"] Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.347695 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.347811 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.347952 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.348155 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.348250 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.348310 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.348442 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.348782 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.348884 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.348944 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.349304 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.349560 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.350344 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.350462 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.352225 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.352382 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.353841 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.355246 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.355658 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.356027 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.357115 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.357666 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.357972 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.358303 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 05:08:08 crc 
kubenswrapper[4841]: I0130 05:08:08.357972 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.358394 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.359359 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.359382 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.359912 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.360172 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.360368 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.360636 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.360595 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:44:49.448503603 +0000 UTC Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.360507 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.360502 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.362990 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363158 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363312 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363523 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363602 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363602 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363663 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363796 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.363961 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.364150 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.364437 4841 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.385626 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.397785 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.409301 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.429561 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.442543 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.446470 4841 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.453063 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.461203 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.471192 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.482591 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.498039 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.515490 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524149 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524354 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524425 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524456 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524482 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524501 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524524 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524548 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524574 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524600 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.524649 4841 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.024606803 +0000 UTC m=+26.018079481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524679 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524712 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524767 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524806 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524839 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524873 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524911 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524947 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524980 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525015 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525049 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.524842 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525060 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525082 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525117 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525150 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525184 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525216 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525285 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525327 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525361 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525393 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525451 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525457 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525536 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525537 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: 
"1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525571 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525605 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525611 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525656 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525839 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525848 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.525985 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526034 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526050 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526190 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526183 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526248 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526297 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526586 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526646 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.526694 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527025 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527084 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527417 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527572 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527852 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527900 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" 
(UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527920 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.527942 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528027 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528088 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528146 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528200 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528254 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528305 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529020 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529088 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529133 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529168 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529202 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529237 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529295 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 05:08:08 crc 
kubenswrapper[4841]: I0130 05:08:08.529338 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529376 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529441 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529504 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529554 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529627 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529675 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529715 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529824 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529861 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529895 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:08 crc kubenswrapper[4841]: 
I0130 05:08:08.529936 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529972 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530038 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530074 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530120 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" 
(UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530169 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530222 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530277 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530452 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530526 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530565 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530609 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530650 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" 
(UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530694 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530739 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530823 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530864 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530906 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530950 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530993 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531037 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531079 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531122 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531158 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531193 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531244 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531284 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531318 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 05:08:08 crc 
kubenswrapper[4841]: I0130 05:08:08.531356 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528088 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528150 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.528652 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529031 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529506 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.529880 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530174 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530362 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530437 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530755 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.530833 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531028 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531183 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531299 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531417 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531432 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531391 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531852 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531923 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.531973 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532012 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532058 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532104 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532151 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532190 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod 
"7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532214 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532290 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532328 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532367 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 05:08:08 crc 
kubenswrapper[4841]: I0130 05:08:08.532450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532529 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532582 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532644 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532694 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532730 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") 
pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532765 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532799 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532836 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532910 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.532975 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533036 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533273 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533335 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533381 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533494 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533530 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533571 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533613 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533652 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533694 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533730 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533774 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533809 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533847 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533889 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.533958 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534009 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534052 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534093 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534130 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534148 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534175 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534212 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534289 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534325 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534415 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534536 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534571 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534625 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534666 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534702 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534782 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534925 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534973 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535013 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535053 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535098 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535139 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535525 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") 
" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535565 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535605 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535653 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535693 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535737 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535774 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535810 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535856 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535891 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535963 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536010 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 
05:08:08.536051 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536088 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536127 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536174 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536211 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536258 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536295 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536331 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536371 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536436 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536508 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536548 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536596 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536637 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536685 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536725 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536776 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536819 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536862 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536908 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536954 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537045 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537104 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5ft\" (UniqueName: \"kubernetes.io/projected/cd2ac8bc-c695-499d-aacb-0e47a29aa569-kube-api-access-lz5ft\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-node-log\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537195 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-env-overrides\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537291 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/262e0db9-4560-4557-823d-8a4145e03fd1-multus-daemon-config\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc 
kubenswrapper[4841]: I0130 05:08:08.537332 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-kubelet\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537368 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/8f5d8664-d53a-4e96-9458-fd915cec77b5-kube-api-access-qldm9\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537460 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-cni-bin\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537568 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-systemd\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538195 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-bin\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 
05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538251 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bd6f6537-92be-4744-b292-751dd9ccdb1b-hosts-file\") pod \"node-resolver-2hhs7\" (UID: \"bd6f6537-92be-4744-b292-751dd9ccdb1b\") " pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxs7q\" (UniqueName: \"kubernetes.io/projected/bd6f6537-92be-4744-b292-751dd9ccdb1b-kube-api-access-jxs7q\") pod \"node-resolver-2hhs7\" (UID: \"bd6f6537-92be-4744-b292-751dd9ccdb1b\") " pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538374 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-os-release\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538543 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-hostroot\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538602 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 
05:08:08.538639 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-netns\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538673 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-var-lib-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538708 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-config\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovn-node-metrics-cert\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538791 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 
05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538831 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqj2\" (UniqueName: \"kubernetes.io/projected/a24700eb-27ff-4126-9f6a-40ee9575e5ef-kube-api-access-nsqj2\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cnibin\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-os-release\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538947 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538981 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-k8s-cni-cncf-io\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539022 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-kubelet\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539062 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-etc-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539104 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539149 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539184 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539305 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-netd\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539341 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a24700eb-27ff-4126-9f6a-40ee9575e5ef-rootfs\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539378 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/262e0db9-4560-4557-823d-8a4145e03fd1-cni-binary-copy\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-script-lib\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 
05:08:08.539480 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539582 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-cni-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539617 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-slash\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539653 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-ovn\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc 
kubenswrapper[4841]: I0130 05:08:08.539697 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a24700eb-27ff-4126-9f6a-40ee9575e5ef-mcd-auth-proxy-config\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539741 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cni-binary-copy\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-systemd-units\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539822 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539858 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-log-socket\") pod \"ovnkube-node-4fl5g\" (UID: 
\"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539909 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.539958 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540012 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540133 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-cnibin\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540199 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-netns\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24700eb-27ff-4126-9f6a-40ee9575e5ef-proxy-tls\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540293 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-system-cni-dir\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540736 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540794 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdgz9\" (UniqueName: \"kubernetes.io/projected/262e0db9-4560-4557-823d-8a4145e03fd1-kube-api-access-mdgz9\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540877 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-system-cni-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540931 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-cni-multus\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540998 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541044 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-socket-dir-parent\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541238 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-conf-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541274 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-multus-certs\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-etc-kubernetes\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537080 4841 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.547170 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534794 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534830 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.534853 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535078 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535083 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535118 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535143 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535161 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535323 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535440 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535598 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535602 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535608 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535628 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535716 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.535870 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536208 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536272 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536316 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536765 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536804 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.536978 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537147 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.537253 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538012 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538089 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538166 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538705 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.538877 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540591 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540900 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.540981 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541295 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541465 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541546 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541753 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.541769 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542313 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542328 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542564 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542803 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542804 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542825 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542955 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.542974 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.543334 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.543335 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.543346 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.543352 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.543363 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.543493 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.547672 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.543812 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.544631 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.544863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.544881 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.545019 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.545298 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.545505 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.545572 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.545736 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.545819 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.545340 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.546277 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.546352 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.546474 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.546750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.546797 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.546826 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.547075 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.547084 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.547149 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.547851 4841 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.547210 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.547787 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.548036 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.548151 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.548243 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.548257 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.048236968 +0000 UTC m=+26.041709606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.548300 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.048286399 +0000 UTC m=+26.041759037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.548875 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.548899 4841 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.548914 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.548927 4841 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549001 4841 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549017 4841 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549069 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549128 4841 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549304 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549341 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549353 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on 
node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549364 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549913 4841 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549971 4841 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.549987 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550001 4841 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550095 4841 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550108 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550119 4841 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550131 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550143 4841 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550156 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550215 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550232 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550244 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550281 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550322 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550337 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550349 4841 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550405 4841 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550420 4841 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550431 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550443 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.550455 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551012 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551494 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551517 4841 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551530 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551547 4841 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551560 4841 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551574 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551587 4841 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551599 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551611 4841 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551623 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.551681 4841 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.552458 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.552744 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.553942 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.553981 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.553988 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.554492 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.554840 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.554933 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555018 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555065 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555141 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555252 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555438 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555612 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.555736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.556015 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.556214 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.556917 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.559553 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.559672 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.559973 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.560775 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.560816 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.560837 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.560852 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.560913 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.060894346 +0000 UTC m=+26.054367084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.561303 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.561855 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.562219 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.562706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.562868 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.563656 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.564174 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.564283 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.567735 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.567748 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.567846 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.567876 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.567889 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.567887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.567944 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:09.067926573 +0000 UTC m=+26.061399331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.568557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.568308 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.568672 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.568853 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569057 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569221 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569230 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569377 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569527 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569635 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.568479 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569667 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.569864 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.570126 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.570431 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.570664 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.571046 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.571117 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.571217 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.571469 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.572611 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.573274 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.573639 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.573801 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.573824 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.573860 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.573879 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.574130 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.574230 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.574273 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.575385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.576647 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.577601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.578417 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.579145 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.582199 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.589033 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.591513 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.600691 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.604741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-system-cni-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652576 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-cni-multus\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652591 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdgz9\" (UniqueName: \"kubernetes.io/projected/262e0db9-4560-4557-823d-8a4145e03fd1-kube-api-access-mdgz9\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " 
pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652606 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-conf-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-multus-certs\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652631 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-etc-kubernetes\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652652 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-socket-dir-parent\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652666 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-node-log\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652680 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-env-overrides\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652701 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5ft\" (UniqueName: \"kubernetes.io/projected/cd2ac8bc-c695-499d-aacb-0e47a29aa569-kube-api-access-lz5ft\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652715 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/262e0db9-4560-4557-823d-8a4145e03fd1-multus-daemon-config\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652727 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-kubelet\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652742 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-cni-bin\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652755 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-systemd\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-bin\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652782 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/8f5d8664-d53a-4e96-9458-fd915cec77b5-kube-api-access-qldm9\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-os-release\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652807 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-hostroot\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652826 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bd6f6537-92be-4744-b292-751dd9ccdb1b-hosts-file\") pod \"node-resolver-2hhs7\" 
(UID: \"bd6f6537-92be-4744-b292-751dd9ccdb1b\") " pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxs7q\" (UniqueName: \"kubernetes.io/projected/bd6f6537-92be-4744-b292-751dd9ccdb1b-kube-api-access-jxs7q\") pod \"node-resolver-2hhs7\" (UID: \"bd6f6537-92be-4744-b292-751dd9ccdb1b\") " pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652852 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-netns\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652865 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-var-lib-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652880 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-config\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652894 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovn-node-metrics-cert\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652907 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cnibin\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652922 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-os-release\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652935 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqj2\" (UniqueName: \"kubernetes.io/projected/a24700eb-27ff-4126-9f6a-40ee9575e5ef-kube-api-access-nsqj2\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652948 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-kubelet\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652962 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-etc-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652977 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.652992 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653005 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-k8s-cni-cncf-io\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653019 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653032 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-netd\") pod \"ovnkube-node-4fl5g\" (UID: 
\"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653066 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a24700eb-27ff-4126-9f6a-40ee9575e5ef-rootfs\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653081 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653095 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-cni-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653108 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/262e0db9-4560-4557-823d-8a4145e03fd1-cni-binary-copy\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653123 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-script-lib\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653136 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653149 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-ovn\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653163 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a24700eb-27ff-4126-9f6a-40ee9575e5ef-mcd-auth-proxy-config\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653185 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cni-binary-copy\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653199 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-slash\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-hostroot\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653212 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-cnibin\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-cnibin\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653283 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-netns\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653300 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bd6f6537-92be-4744-b292-751dd9ccdb1b-hosts-file\") pod \"node-resolver-2hhs7\" (UID: \"bd6f6537-92be-4744-b292-751dd9ccdb1b\") " pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653306 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-systemd-units\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653325 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-log-socket\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24700eb-27ff-4126-9f6a-40ee9575e5ef-proxy-tls\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.653422 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-system-cni-dir\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.654141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-os-release\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.654165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-system-cni-dir\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.654288 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-netns\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655221 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/262e0db9-4560-4557-823d-8a4145e03fd1-multus-daemon-config\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655308 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-cni-multus\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655330 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-systemd\") pod \"ovnkube-node-4fl5g\" (UID: 
\"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655373 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-kubelet\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655577 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-bin\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655632 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-systemd-units\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655640 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-multus-certs\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc 
kubenswrapper[4841]: I0130 05:08:08.655714 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-cni-bin\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-etc-kubernetes\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655803 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-socket-dir-parent\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655846 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-conf-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655889 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-node-log\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.655804 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-system-cni-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.656113 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-run-k8s-cni-cncf-io\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.656620 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-os-release\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.656663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a24700eb-27ff-4126-9f6a-40ee9575e5ef-rootfs\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.656695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-netns\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.656852 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-env-overrides\") pod \"ovnkube-node-4fl5g\" 
(UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657009 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-var-lib-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657752 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a24700eb-27ff-4126-9f6a-40ee9575e5ef-mcd-auth-proxy-config\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657918 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657920 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-host-var-lib-kubelet\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657951 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cnibin\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " 
pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657971 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-netd\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657983 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.658057 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-ovn\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.658086 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-log-socket\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 
05:08:08.658330 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-slash\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.658199 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-etc-openvswitch\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.658535 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262e0db9-4560-4557-823d-8a4145e03fd1-multus-cni-dir\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.658637 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-config\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.658648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.659251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cni-binary-copy\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.659428 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd2ac8bc-c695-499d-aacb-0e47a29aa569-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.659538 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-script-lib\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.657856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/262e0db9-4560-4557-823d-8a4145e03fd1-cni-binary-copy\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.659863 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.659891 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 30 
05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.659905 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.659927 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660660 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660702 4841 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660723 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660737 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660750 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 
05:08:08.660763 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660785 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660799 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660811 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660824 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660840 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660852 4841 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660864 4841 reconciler_common.go:293] "Volume detached for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660880 4841 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660893 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660906 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.660919 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661078 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661092 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661108 4841 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661121 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661137 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661150 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661162 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661259 4841 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661372 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661409 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") 
on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661423 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661442 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661455 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661576 4841 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661784 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661810 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661824 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node 
\"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661837 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661850 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661862 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661875 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661888 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661900 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661913 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661926 4841 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661939 4841 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661954 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661966 4841 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661978 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.661991 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662003 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662015 4841 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662026 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662038 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662050 4841 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662062 4841 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662074 4841 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662087 4841 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662099 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662111 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662124 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662137 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662148 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662160 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662172 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662184 4841 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 
05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662196 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662209 4841 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662221 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662235 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662248 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662260 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662272 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662284 4841 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662296 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662309 4841 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662321 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662332 4841 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662344 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662357 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662368 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662379 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662391 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662426 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662439 4841 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662452 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662465 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662476 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") 
on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662488 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662500 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662511 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662522 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662534 4841 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662545 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662557 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662568 4841 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662580 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662592 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662604 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662616 4841 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662628 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662640 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662654 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662665 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662677 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662690 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662732 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662743 4841 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662756 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662768 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 
30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662779 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662790 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662801 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662804 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovn-node-metrics-cert\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662833 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662846 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662858 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 
30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662869 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662882 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662893 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662905 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662916 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662928 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662939 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662951 4841 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662963 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662975 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.662987 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663006 4841 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663019 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663030 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663042 4841 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663054 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663074 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663085 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663097 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663109 4841 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663121 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663132 4841 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: 
I0130 05:08:08.663143 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663154 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663165 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663176 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663188 4841 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663199 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663210 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.663530 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a24700eb-27ff-4126-9f6a-40ee9575e5ef-proxy-tls\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.667251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxs7q\" (UniqueName: \"kubernetes.io/projected/bd6f6537-92be-4744-b292-751dd9ccdb1b-kube-api-access-jxs7q\") pod \"node-resolver-2hhs7\" (UID: \"bd6f6537-92be-4744-b292-751dd9ccdb1b\") " pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.668045 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ac8bc-c695-499d-aacb-0e47a29aa569-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.674386 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.677048 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/8f5d8664-d53a-4e96-9458-fd915cec77b5-kube-api-access-qldm9\") pod \"ovnkube-node-4fl5g\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.678511 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdgz9\" (UniqueName: \"kubernetes.io/projected/262e0db9-4560-4557-823d-8a4145e03fd1-kube-api-access-mdgz9\") pod \"multus-c49cw\" (UID: \"262e0db9-4560-4557-823d-8a4145e03fd1\") " pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.680043 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqj2\" (UniqueName: \"kubernetes.io/projected/a24700eb-27ff-4126-9f6a-40ee9575e5ef-kube-api-access-nsqj2\") pod \"machine-config-daemon-hd8v2\" (UID: \"a24700eb-27ff-4126-9f6a-40ee9575e5ef\") " pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.681769 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5ft\" (UniqueName: \"kubernetes.io/projected/cd2ac8bc-c695-499d-aacb-0e47a29aa569-kube-api-access-lz5ft\") pod \"multus-additional-cni-plugins-2ww56\" (UID: \"cd2ac8bc-c695-499d-aacb-0e47a29aa569\") " pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.686016 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.691563 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:08 crc kubenswrapper[4841]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Jan 30 05:08:08 crc kubenswrapper[4841]: set -o allexport Jan 30 05:08:08 crc kubenswrapper[4841]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Jan 30 05:08:08 crc kubenswrapper[4841]: source /etc/kubernetes/apiserver-url.env Jan 30 05:08:08 crc kubenswrapper[4841]: else Jan 30 05:08:08 crc kubenswrapper[4841]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Jan 30 05:08:08 crc kubenswrapper[4841]: exit 1 Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Jan 30 05:08:08 crc kubenswrapper[4841]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:08 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.692683 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.693335 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 05:08:08 crc kubenswrapper[4841]: W0130 05:08:08.699201 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-11ba72333be3006758d58388a45c113cb337749ecf22bc73e22316a8aba0f030 WatchSource:0}: Error finding container 11ba72333be3006758d58388a45c113cb337749ecf22bc73e22316a8aba0f030: Status 404 returned error can't find the container with id 11ba72333be3006758d58388a45c113cb337749ecf22bc73e22316a8aba0f030 Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.702171 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.703360 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.704520 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2hhs7" Jan 30 05:08:08 crc kubenswrapper[4841]: W0130 05:08:08.706874 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-46963d5340255917b2e92a6028f6c2f975fc453f6e7fa671ed9242b99ca7ae92 WatchSource:0}: Error finding container 46963d5340255917b2e92a6028f6c2f975fc453f6e7fa671ed9242b99ca7ae92: Status 404 returned error can't find the container with id 46963d5340255917b2e92a6028f6c2f975fc453f6e7fa671ed9242b99ca7ae92 Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.710609 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.713360 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:08 crc kubenswrapper[4841]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 30 05:08:08 crc kubenswrapper[4841]: if [[ -f "/env/_master" ]]; then Jan 30 05:08:08 crc kubenswrapper[4841]: set -o allexport Jan 30 05:08:08 crc kubenswrapper[4841]: source "/env/_master" Jan 30 05:08:08 crc kubenswrapper[4841]: set +o allexport Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Jan 30 05:08:08 crc kubenswrapper[4841]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Jan 30 05:08:08 crc kubenswrapper[4841]: ho_enable="--enable-hybrid-overlay" Jan 30 05:08:08 crc kubenswrapper[4841]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Jan 30 05:08:08 crc kubenswrapper[4841]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Jan 30 05:08:08 crc kubenswrapper[4841]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Jan 30 05:08:08 crc kubenswrapper[4841]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 30 05:08:08 crc kubenswrapper[4841]: --webhook-cert-dir="/etc/webhook-cert" \ Jan 30 05:08:08 crc kubenswrapper[4841]: --webhook-host=127.0.0.1 \ Jan 30 05:08:08 crc kubenswrapper[4841]: --webhook-port=9743 \ Jan 30 05:08:08 crc kubenswrapper[4841]: ${ho_enable} \ Jan 30 05:08:08 crc kubenswrapper[4841]: --enable-interconnect \ Jan 30 05:08:08 crc kubenswrapper[4841]: --disable-approver \ Jan 30 05:08:08 crc kubenswrapper[4841]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Jan 30 05:08:08 crc kubenswrapper[4841]: --wait-for-kubernetes-api=200s \ Jan 30 05:08:08 crc kubenswrapper[4841]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Jan 30 05:08:08 crc kubenswrapper[4841]: --loglevel="${LOGLEVEL}" Jan 30 05:08:08 crc kubenswrapper[4841]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:08 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.716670 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:08 crc kubenswrapper[4841]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 30 05:08:08 crc 
kubenswrapper[4841]: if [[ -f "/env/_master" ]]; then Jan 30 05:08:08 crc kubenswrapper[4841]: set -o allexport Jan 30 05:08:08 crc kubenswrapper[4841]: source "/env/_master" Jan 30 05:08:08 crc kubenswrapper[4841]: set +o allexport Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Jan 30 05:08:08 crc kubenswrapper[4841]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 30 05:08:08 crc kubenswrapper[4841]: --disable-webhook \ Jan 30 05:08:08 crc kubenswrapper[4841]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Jan 30 05:08:08 crc kubenswrapper[4841]: --loglevel="${LOGLEVEL}" Jan 30 05:08:08 crc kubenswrapper[4841]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:08 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.717917 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.719088 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c49cw" Jan 30 05:08:08 crc kubenswrapper[4841]: W0130 05:08:08.720888 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6f6537_92be_4744_b292_751dd9ccdb1b.slice/crio-688e3bacc49a200b526f32373d10f35434b3d2b6f26ce90a5d4364ee8bbb5b81 WatchSource:0}: Error finding container 688e3bacc49a200b526f32373d10f35434b3d2b6f26ce90a5d4364ee8bbb5b81: Status 404 returned error can't find the container with id 688e3bacc49a200b526f32373d10f35434b3d2b6f26ce90a5d4364ee8bbb5b81 Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.727519 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:08 crc kubenswrapper[4841]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Jan 30 05:08:08 crc kubenswrapper[4841]: set -uo pipefail Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Jan 30 05:08:08 crc kubenswrapper[4841]: HOSTS_FILE="/etc/hosts" Jan 30 05:08:08 crc kubenswrapper[4841]: TEMP_FILE="/etc/hosts.tmp" Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: IFS=', ' read -r -a services <<< "${SERVICES}" Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: # Make a temporary file with the old hosts file's attributes. Jan 30 05:08:08 crc kubenswrapper[4841]: if ! 
cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Jan 30 05:08:08 crc kubenswrapper[4841]: echo "Failed to preserve hosts file. Exiting." Jan 30 05:08:08 crc kubenswrapper[4841]: exit 1 Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: while true; do Jan 30 05:08:08 crc kubenswrapper[4841]: declare -A svc_ips Jan 30 05:08:08 crc kubenswrapper[4841]: for svc in "${services[@]}"; do Jan 30 05:08:08 crc kubenswrapper[4841]: # Fetch service IP from cluster dns if present. We make several tries Jan 30 05:08:08 crc kubenswrapper[4841]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Jan 30 05:08:08 crc kubenswrapper[4841]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Jan 30 05:08:08 crc kubenswrapper[4841]: # support UDP loadbalancers and require reaching DNS through TCP. Jan 30 05:08:08 crc kubenswrapper[4841]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 30 05:08:08 crc kubenswrapper[4841]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 30 05:08:08 crc kubenswrapper[4841]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 30 05:08:08 crc kubenswrapper[4841]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Jan 30 05:08:08 crc kubenswrapper[4841]: for i in ${!cmds[*]} Jan 30 05:08:08 crc kubenswrapper[4841]: do Jan 30 05:08:08 crc kubenswrapper[4841]: ips=($(eval "${cmds[i]}")) Jan 30 05:08:08 crc kubenswrapper[4841]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Jan 30 05:08:08 crc kubenswrapper[4841]: svc_ips["${svc}"]="${ips[@]}" Jan 30 05:08:08 crc kubenswrapper[4841]: break Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: done Jan 30 05:08:08 crc kubenswrapper[4841]: done Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: # Update /etc/hosts only if we get valid service IPs Jan 30 05:08:08 crc kubenswrapper[4841]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Jan 30 05:08:08 crc kubenswrapper[4841]: # Stale entries could exist in /etc/hosts if the service is deleted Jan 30 05:08:08 crc kubenswrapper[4841]: if [[ -n "${svc_ips[*]-}" ]]; then Jan 30 05:08:08 crc kubenswrapper[4841]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Jan 30 05:08:08 crc kubenswrapper[4841]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Jan 30 05:08:08 crc kubenswrapper[4841]: # Only continue rebuilding the hosts entries if its original content is preserved Jan 30 05:08:08 crc kubenswrapper[4841]: sleep 60 & wait Jan 30 05:08:08 crc kubenswrapper[4841]: continue Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: # Append resolver entries for services Jan 30 05:08:08 crc kubenswrapper[4841]: rc=0 Jan 30 05:08:08 crc kubenswrapper[4841]: for svc in "${!svc_ips[@]}"; do Jan 30 05:08:08 crc kubenswrapper[4841]: for ip in ${svc_ips[${svc}]}; do Jan 30 05:08:08 crc kubenswrapper[4841]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Jan 30 05:08:08 crc kubenswrapper[4841]: done Jan 30 05:08:08 crc kubenswrapper[4841]: done Jan 30 05:08:08 crc kubenswrapper[4841]: if [[ $rc -ne 0 ]]; then Jan 30 05:08:08 crc kubenswrapper[4841]: sleep 60 & wait Jan 30 05:08:08 crc kubenswrapper[4841]: continue Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: Jan 30 05:08:08 crc kubenswrapper[4841]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Jan 30 05:08:08 crc kubenswrapper[4841]: # Replace /etc/hosts with our modified version if needed Jan 30 05:08:08 crc kubenswrapper[4841]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Jan 30 05:08:08 crc kubenswrapper[4841]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Jan 30 05:08:08 crc kubenswrapper[4841]: fi Jan 30 05:08:08 crc kubenswrapper[4841]: sleep 60 & wait Jan 30 05:08:08 crc kubenswrapper[4841]: unset svc_ips Jan 30 05:08:08 crc kubenswrapper[4841]: done Jan 30 05:08:08 crc kubenswrapper[4841]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxs7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-2hhs7_openshift-dns(bd6f6537-92be-4744-b292-751dd9ccdb1b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:08 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.728661 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2ww56" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.728912 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-2hhs7" podUID="bd6f6537-92be-4744-b292-751dd9ccdb1b" Jan 30 05:08:08 crc kubenswrapper[4841]: W0130 05:08:08.735044 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24700eb_27ff_4126_9f6a_40ee9575e5ef.slice/crio-a1b3c2cf7e1fae9e2c833c0cf38470d95e5ed8df5c7aa723dbb3d9ec86dc651e WatchSource:0}: Error finding container a1b3c2cf7e1fae9e2c833c0cf38470d95e5ed8df5c7aa723dbb3d9ec86dc651e: Status 404 returned error can't find the container with id a1b3c2cf7e1fae9e2c833c0cf38470d95e5ed8df5c7aa723dbb3d9ec86dc651e Jan 30 05:08:08 crc kubenswrapper[4841]: W0130 05:08:08.737210 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod262e0db9_4560_4557_823d_8a4145e03fd1.slice/crio-d8d2f180dd638e5ffbb816eef30ca3d0e37b93c3a53ce51a1ae0d8cf727bc397 WatchSource:0}: Error finding container d8d2f180dd638e5ffbb816eef30ca3d0e37b93c3a53ce51a1ae0d8cf727bc397: Status 404 returned error can't find the container with id d8d2f180dd638e5ffbb816eef30ca3d0e37b93c3a53ce51a1ae0d8cf727bc397 Jan 30 05:08:08 crc kubenswrapper[4841]: I0130 05:08:08.737280 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.739141 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nsqj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.742227 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:08 crc kubenswrapper[4841]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Jan 30 05:08:08 crc kubenswrapper[4841]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Jan 30 05:08:08 crc kubenswrapper[4841]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mdgz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-c49cw_openshift-multus(262e0db9-4560-4557-823d-8a4145e03fd1): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:08 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.743285 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-c49cw" podUID="262e0db9-4560-4557-823d-8a4145e03fd1" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.744168 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nsqj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.745512 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:08:08 crc kubenswrapper[4841]: W0130 05:08:08.750720 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd2ac8bc_c695_499d_aacb_0e47a29aa569.slice/crio-a6045d7f9121152ebfb1e24ce71da4eb660a39bc2fca155fb31e094572baafe5 WatchSource:0}: Error finding container a6045d7f9121152ebfb1e24ce71da4eb660a39bc2fca155fb31e094572baafe5: Status 404 returned error can't find the container with id a6045d7f9121152ebfb1e24ce71da4eb660a39bc2fca155fb31e094572baafe5 Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.753562 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lz5ft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[
]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-2ww56_openshift-multus(cd2ac8bc-c695-499d-aacb-0e47a29aa569): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.754702 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-2ww56" podUID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" Jan 30 05:08:08 crc kubenswrapper[4841]: W0130 05:08:08.774206 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5d8664_d53a_4e96_9458_fd915cec77b5.slice/crio-835b0026c306346f11b0b21630d16cee154fc8d3e6e7b6f10631f09b7c217c02 WatchSource:0}: Error finding container 835b0026c306346f11b0b21630d16cee154fc8d3e6e7b6f10631f09b7c217c02: Status 404 returned error can't find the container with id 835b0026c306346f11b0b21630d16cee154fc8d3e6e7b6f10631f09b7c217c02 Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.776858 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:08 crc kubenswrapper[4841]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Jan 30 05:08:08 crc kubenswrapper[4841]: apiVersion: v1 Jan 30 05:08:08 crc kubenswrapper[4841]: clusters: Jan 30 05:08:08 crc kubenswrapper[4841]: - cluster: Jan 30 05:08:08 crc kubenswrapper[4841]: certificate-authority: 
/var/run/secrets/kubernetes.io/serviceaccount/ca.crt Jan 30 05:08:08 crc kubenswrapper[4841]: server: https://api-int.crc.testing:6443 Jan 30 05:08:08 crc kubenswrapper[4841]: name: default-cluster Jan 30 05:08:08 crc kubenswrapper[4841]: contexts: Jan 30 05:08:08 crc kubenswrapper[4841]: - context: Jan 30 05:08:08 crc kubenswrapper[4841]: cluster: default-cluster Jan 30 05:08:08 crc kubenswrapper[4841]: namespace: default Jan 30 05:08:08 crc kubenswrapper[4841]: user: default-auth Jan 30 05:08:08 crc kubenswrapper[4841]: name: default-context Jan 30 05:08:08 crc kubenswrapper[4841]: current-context: default-context Jan 30 05:08:08 crc kubenswrapper[4841]: kind: Config Jan 30 05:08:08 crc kubenswrapper[4841]: preferences: {} Jan 30 05:08:08 crc kubenswrapper[4841]: users: Jan 30 05:08:08 crc kubenswrapper[4841]: - name: default-auth Jan 30 05:08:08 crc kubenswrapper[4841]: user: Jan 30 05:08:08 crc kubenswrapper[4841]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Jan 30 05:08:08 crc kubenswrapper[4841]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Jan 30 05:08:08 crc kubenswrapper[4841]: EOF Jan 30 05:08:08 crc kubenswrapper[4841]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qldm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-4fl5g_openshift-ovn-kubernetes(8f5d8664-d53a-4e96-9458-fd915cec77b5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:08 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:08 crc kubenswrapper[4841]: E0130 05:08:08.778703 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.066648 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.066828 4841 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:10.066796515 +0000 UTC m=+27.060269153 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.066932 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.066999 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.067041 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067123 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067196 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067218 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067242 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067246 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:10.067214136 +0000 UTC m=+27.060686844 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067257 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067286 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:10.067264427 +0000 UTC m=+27.060737095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.067319 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:10.067299728 +0000 UTC m=+27.060772516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.167745 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.167896 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.167928 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.167957 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.168013 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-30 05:08:10.167997773 +0000 UTC m=+27.161470411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.360787 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 17:06:41.282409833 +0000 UTC Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.470276 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-d7bdh"] Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.470831 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.474447 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.474659 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.474735 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.477317 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.487822 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.498394 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.515064 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.527034 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.542076 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.568471 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.572080 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9p8z\" (UniqueName: \"kubernetes.io/projected/577a5760-2c37-4852-90c0-4962c2362824-kube-api-access-n9p8z\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.572188 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/577a5760-2c37-4852-90c0-4962c2362824-host\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.572280 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/577a5760-2c37-4852-90c0-4962c2362824-serviceca\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.585915 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c49cw" event={"ID":"262e0db9-4560-4557-823d-8a4145e03fd1","Type":"ContainerStarted","Data":"d8d2f180dd638e5ffbb816eef30ca3d0e37b93c3a53ce51a1ae0d8cf727bc397"} Jan 
30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.587390 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.587927 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:09 crc kubenswrapper[4841]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Jan 30 05:08:09 crc kubenswrapper[4841]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Jan 30 05:08:09 crc kubenswrapper[4841]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mdgz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-c49cw_openshift-multus(262e0db9-4560-4557-823d-8a4145e03fd1): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:09 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.588385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"11ba72333be3006758d58388a45c113cb337749ecf22bc73e22316a8aba0f030"} Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.588980 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-c49cw" podUID="262e0db9-4560-4557-823d-8a4145e03fd1" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.589911 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"527aaa0a94b3965903065c16d3ae31fe2f148506abf93293467a250f7bec2743"} Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.590051 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.591178 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.593119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"835b0026c306346f11b0b21630d16cee154fc8d3e6e7b6f10631f09b7c217c02"} Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.593310 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:09 crc kubenswrapper[4841]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Jan 30 05:08:09 crc kubenswrapper[4841]: set -o allexport Jan 30 05:08:09 crc kubenswrapper[4841]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Jan 30 05:08:09 crc kubenswrapper[4841]: source /etc/kubernetes/apiserver-url.env Jan 30 05:08:09 crc kubenswrapper[4841]: else Jan 30 05:08:09 crc kubenswrapper[4841]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Jan 30 05:08:09 crc kubenswrapper[4841]: exit 1 Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Jan 30 05:08:09 crc kubenswrapper[4841]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:09 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.594930 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.595825 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" 
event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerStarted","Data":"a6045d7f9121152ebfb1e24ce71da4eb660a39bc2fca155fb31e094572baafe5"} Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.596083 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:09 crc kubenswrapper[4841]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Jan 30 05:08:09 crc kubenswrapper[4841]: apiVersion: v1 Jan 30 05:08:09 crc kubenswrapper[4841]: clusters: Jan 30 05:08:09 crc kubenswrapper[4841]: - cluster: Jan 30 05:08:09 crc kubenswrapper[4841]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Jan 30 05:08:09 crc kubenswrapper[4841]: server: https://api-int.crc.testing:6443 Jan 30 05:08:09 crc kubenswrapper[4841]: name: default-cluster Jan 30 05:08:09 crc kubenswrapper[4841]: contexts: Jan 30 05:08:09 crc kubenswrapper[4841]: - context: Jan 30 05:08:09 crc kubenswrapper[4841]: cluster: default-cluster Jan 30 05:08:09 crc kubenswrapper[4841]: namespace: default Jan 30 05:08:09 crc kubenswrapper[4841]: user: default-auth Jan 30 05:08:09 crc kubenswrapper[4841]: name: default-context Jan 30 05:08:09 crc kubenswrapper[4841]: current-context: default-context Jan 30 05:08:09 crc kubenswrapper[4841]: kind: Config Jan 30 05:08:09 crc kubenswrapper[4841]: preferences: {} Jan 30 05:08:09 crc kubenswrapper[4841]: users: Jan 30 05:08:09 crc kubenswrapper[4841]: - name: default-auth Jan 30 05:08:09 crc kubenswrapper[4841]: user: Jan 30 05:08:09 crc kubenswrapper[4841]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Jan 30 05:08:09 crc kubenswrapper[4841]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Jan 30 05:08:09 crc kubenswrapper[4841]: EOF Jan 30 05:08:09 crc kubenswrapper[4841]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qldm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-4fl5g_openshift-ovn-kubernetes(8f5d8664-d53a-4e96-9458-fd915cec77b5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:09 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.597763 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.598565 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lz5ft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-2ww56_openshift-multus(cd2ac8bc-c695-499d-aacb-0e47a29aa569): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.598687 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"a1b3c2cf7e1fae9e2c833c0cf38470d95e5ed8df5c7aa723dbb3d9ec86dc651e"} Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.600603 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nsqj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.600664 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2hhs7" event={"ID":"bd6f6537-92be-4744-b292-751dd9ccdb1b","Type":"ContainerStarted","Data":"688e3bacc49a200b526f32373d10f35434b3d2b6f26ce90a5d4364ee8bbb5b81"} Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.601183 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-2ww56" podUID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.602006 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:09 crc kubenswrapper[4841]: container 
&Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Jan 30 05:08:09 crc kubenswrapper[4841]: set -uo pipefail Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Jan 30 05:08:09 crc kubenswrapper[4841]: HOSTS_FILE="/etc/hosts" Jan 30 05:08:09 crc kubenswrapper[4841]: TEMP_FILE="/etc/hosts.tmp" Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: IFS=', ' read -r -a services <<< "${SERVICES}" Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: # Make a temporary file with the old hosts file's attributes. Jan 30 05:08:09 crc kubenswrapper[4841]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Jan 30 05:08:09 crc kubenswrapper[4841]: echo "Failed to preserve hosts file. Exiting." Jan 30 05:08:09 crc kubenswrapper[4841]: exit 1 Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: while true; do Jan 30 05:08:09 crc kubenswrapper[4841]: declare -A svc_ips Jan 30 05:08:09 crc kubenswrapper[4841]: for svc in "${services[@]}"; do Jan 30 05:08:09 crc kubenswrapper[4841]: # Fetch service IP from cluster dns if present. We make several tries Jan 30 05:08:09 crc kubenswrapper[4841]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Jan 30 05:08:09 crc kubenswrapper[4841]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Jan 30 05:08:09 crc kubenswrapper[4841]: # support UDP loadbalancers and require reaching DNS through TCP. 
Jan 30 05:08:09 crc kubenswrapper[4841]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 30 05:08:09 crc kubenswrapper[4841]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 30 05:08:09 crc kubenswrapper[4841]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 30 05:08:09 crc kubenswrapper[4841]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Jan 30 05:08:09 crc kubenswrapper[4841]: for i in ${!cmds[*]} Jan 30 05:08:09 crc kubenswrapper[4841]: do Jan 30 05:08:09 crc kubenswrapper[4841]: ips=($(eval "${cmds[i]}")) Jan 30 05:08:09 crc kubenswrapper[4841]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Jan 30 05:08:09 crc kubenswrapper[4841]: svc_ips["${svc}"]="${ips[@]}" Jan 30 05:08:09 crc kubenswrapper[4841]: break Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: # Update /etc/hosts only if we get valid service IPs Jan 30 05:08:09 crc kubenswrapper[4841]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Jan 30 05:08:09 crc kubenswrapper[4841]: # Stale entries could exist in /etc/hosts if the service is deleted Jan 30 05:08:09 crc kubenswrapper[4841]: if [[ -n "${svc_ips[*]-}" ]]; then Jan 30 05:08:09 crc kubenswrapper[4841]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Jan 30 05:08:09 crc kubenswrapper[4841]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Jan 30 05:08:09 crc kubenswrapper[4841]: # Only continue rebuilding the hosts entries if its original content is preserved Jan 30 05:08:09 crc kubenswrapper[4841]: sleep 60 & wait Jan 30 05:08:09 crc kubenswrapper[4841]: continue Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: # Append resolver entries for services Jan 30 05:08:09 crc kubenswrapper[4841]: rc=0 Jan 30 05:08:09 crc kubenswrapper[4841]: for svc in "${!svc_ips[@]}"; do Jan 30 05:08:09 crc kubenswrapper[4841]: for ip in ${svc_ips[${svc}]}; do Jan 30 05:08:09 crc kubenswrapper[4841]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: if [[ $rc -ne 0 ]]; then Jan 30 05:08:09 crc kubenswrapper[4841]: sleep 60 & wait Jan 30 05:08:09 crc kubenswrapper[4841]: continue Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Jan 30 05:08:09 crc kubenswrapper[4841]: # Replace /etc/hosts with our modified version if needed Jan 30 05:08:09 crc kubenswrapper[4841]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Jan 30 05:08:09 crc kubenswrapper[4841]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: sleep 60 & wait Jan 30 05:08:09 crc kubenswrapper[4841]: unset svc_ips Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxs7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-2hhs7_openshift-dns(bd6f6537-92be-4744-b292-751dd9ccdb1b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:09 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.602633 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"46963d5340255917b2e92a6028f6c2f975fc453f6e7fa671ed9242b99ca7ae92"} Jan 30 05:08:09 crc 
kubenswrapper[4841]: E0130 05:08:09.603302 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-2hhs7" podUID="bd6f6537-92be-4744-b292-751dd9ccdb1b" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.603545 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nsqj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.604205 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.604333 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:09 crc kubenswrapper[4841]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 30 05:08:09 crc kubenswrapper[4841]: if [[ -f "/env/_master" ]]; then Jan 30 05:08:09 crc kubenswrapper[4841]: set -o allexport Jan 30 05:08:09 crc kubenswrapper[4841]: source "/env/_master" Jan 30 05:08:09 crc kubenswrapper[4841]: set +o allexport Jan 30 05:08:09 crc 
kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Jan 30 05:08:09 crc kubenswrapper[4841]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Jan 30 05:08:09 crc kubenswrapper[4841]: ho_enable="--enable-hybrid-overlay" Jan 30 05:08:09 crc kubenswrapper[4841]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Jan 30 05:08:09 crc kubenswrapper[4841]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Jan 30 05:08:09 crc kubenswrapper[4841]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Jan 30 05:08:09 crc kubenswrapper[4841]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 30 05:08:09 crc kubenswrapper[4841]: --webhook-cert-dir="/etc/webhook-cert" \ Jan 30 05:08:09 crc kubenswrapper[4841]: --webhook-host=127.0.0.1 \ Jan 30 05:08:09 crc kubenswrapper[4841]: --webhook-port=9743 \ Jan 30 05:08:09 crc kubenswrapper[4841]: ${ho_enable} \ Jan 30 05:08:09 crc kubenswrapper[4841]: --enable-interconnect \ Jan 30 05:08:09 crc kubenswrapper[4841]: --disable-approver \ Jan 30 05:08:09 crc kubenswrapper[4841]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Jan 30 05:08:09 crc kubenswrapper[4841]: --wait-for-kubernetes-api=200s \ Jan 30 05:08:09 crc kubenswrapper[4841]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Jan 30 05:08:09 crc kubenswrapper[4841]: --loglevel="${LOGLEVEL}" Jan 30 05:08:09 crc kubenswrapper[4841]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:09 crc 
kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.604719 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.608075 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:09 crc kubenswrapper[4841]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 30 05:08:09 crc kubenswrapper[4841]: if [[ -f "/env/_master" ]]; then Jan 30 05:08:09 crc kubenswrapper[4841]: set -o allexport Jan 30 05:08:09 crc kubenswrapper[4841]: source "/env/_master" Jan 30 05:08:09 crc kubenswrapper[4841]: set +o allexport Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: Jan 30 05:08:09 crc kubenswrapper[4841]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Jan 30 05:08:09 crc kubenswrapper[4841]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 30 05:08:09 crc kubenswrapper[4841]: --disable-webhook \ Jan 30 05:08:09 crc kubenswrapper[4841]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Jan 30 05:08:09 crc kubenswrapper[4841]: --loglevel="${LOGLEVEL}" Jan 30 05:08:09 crc kubenswrapper[4841]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:09 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.609589 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to 
\"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.619262 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.629726 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.644725 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.656218 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.667479 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.673068 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/577a5760-2c37-4852-90c0-4962c2362824-serviceca\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.673333 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9p8z\" (UniqueName: \"kubernetes.io/projected/577a5760-2c37-4852-90c0-4962c2362824-kube-api-access-n9p8z\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.673451 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/577a5760-2c37-4852-90c0-4962c2362824-host\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 
05:08:09.674719 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/577a5760-2c37-4852-90c0-4962c2362824-host\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.676713 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/577a5760-2c37-4852-90c0-4962c2362824-serviceca\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.678769 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.688339 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.699645 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9p8z\" (UniqueName: \"kubernetes.io/projected/577a5760-2c37-4852-90c0-4962c2362824-kube-api-access-n9p8z\") pod \"node-ca-d7bdh\" (UID: \"577a5760-2c37-4852-90c0-4962c2362824\") " pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.701534 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.713111 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.730417 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.739099 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.745545 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.754882 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.765475 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.777494 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.788872 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:09 crc kubenswrapper[4841]: I0130 05:08:09.792098 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d7bdh" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.810022 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:08:09 crc kubenswrapper[4841]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Jan 30 05:08:09 crc kubenswrapper[4841]: while [ true ]; Jan 30 05:08:09 crc kubenswrapper[4841]: do Jan 30 05:08:09 crc kubenswrapper[4841]: for f in $(ls /tmp/serviceca); do Jan 30 05:08:09 crc kubenswrapper[4841]: echo $f Jan 30 05:08:09 crc kubenswrapper[4841]: ca_file_path="/tmp/serviceca/${f}" Jan 30 05:08:09 crc kubenswrapper[4841]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Jan 30 05:08:09 crc kubenswrapper[4841]: reg_dir_path="/etc/docker/certs.d/${f}" Jan 30 05:08:09 crc kubenswrapper[4841]: if [ -e "${reg_dir_path}" ]; then Jan 30 05:08:09 crc 
kubenswrapper[4841]: cp -u $ca_file_path $reg_dir_path/ca.crt Jan 30 05:08:09 crc kubenswrapper[4841]: else Jan 30 05:08:09 crc kubenswrapper[4841]: mkdir $reg_dir_path Jan 30 05:08:09 crc kubenswrapper[4841]: cp $ca_file_path $reg_dir_path/ca.crt Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: for d in $(ls /etc/docker/certs.d); do Jan 30 05:08:09 crc kubenswrapper[4841]: echo $d Jan 30 05:08:09 crc kubenswrapper[4841]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Jan 30 05:08:09 crc kubenswrapper[4841]: reg_conf_path="/tmp/serviceca/${dp}" Jan 30 05:08:09 crc kubenswrapper[4841]: if [ ! -e "${reg_conf_path}" ]; then Jan 30 05:08:09 crc kubenswrapper[4841]: rm -rf /etc/docker/certs.d/$d Jan 30 05:08:09 crc kubenswrapper[4841]: fi Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: sleep 60 & wait ${!} Jan 30 05:08:09 crc kubenswrapper[4841]: done Jan 30 05:08:09 crc kubenswrapper[4841]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9p8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-d7bdh_openshift-image-registry(577a5760-2c37-4852-90c0-4962c2362824): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:09 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:09 crc kubenswrapper[4841]: E0130 05:08:09.812166 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-d7bdh" podUID="577a5760-2c37-4852-90c0-4962c2362824" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.077701 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.077860 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.077924 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.077987 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:12.077929077 +0000 UTC m=+29.071401745 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078048 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.078050 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078180 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:12.078104181 +0000 UTC m=+29.071576859 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078201 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078208 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078226 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078331 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:12.078301636 +0000 UTC m=+29.071774314 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078346 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.078391 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:12.078381168 +0000 UTC m=+29.071853806 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.178934 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.179232 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.179291 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.179312 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.179499 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed.
No retries permitted until 2026-01-30 05:08:12.179464184 +0000 UTC m=+29.172936862 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.361142 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 05:23:07.276317901 +0000 UTC
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.431039 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.431071 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.431225 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.431260 4841 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.431356 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.431503 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.440723 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.441256 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.442420 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.442994 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f"
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.443888 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.444347 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.444932 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.445818 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.446380 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.447232 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.447702 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.448676 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.449130 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.449645 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.450495 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.450970 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.451859 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.452215 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.452757 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.453675 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.454090 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.454968 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.455364 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.456319 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.456711 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.457265 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.458260 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.458716 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.459612 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.460053 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.461114 4841 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.461434 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.464534 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.465670 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.466716 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.469260 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.471324 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.472195 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.473183 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.474170 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.474829 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.475698 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.476555 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.477423 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.478081 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.478897 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.479799 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.481593 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.482294 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.482978 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.483686 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.484423 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.485184 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.485836 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.606497 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d7bdh" event={"ID":"577a5760-2c37-4852-90c0-4962c2362824","Type":"ContainerStarted","Data":"b0ab003db8d117889ecdd3f4bdb09435387fec1a2fd0c4be62fbe7f00ec41bac"}
Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.608968 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Jan 30 05:08:10 crc kubenswrapper[4841]: 	container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM
Jan 30 05:08:10 crc kubenswrapper[4841]: while [ true ];
Jan 30 05:08:10 crc kubenswrapper[4841]: do
Jan 30 05:08:10 crc kubenswrapper[4841]: for f in $(ls /tmp/serviceca); do
Jan 30 05:08:10 crc kubenswrapper[4841]: echo $f
Jan 30 05:08:10 crc kubenswrapper[4841]: ca_file_path="/tmp/serviceca/${f}"
Jan 30 05:08:10 crc kubenswrapper[4841]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/')
Jan 30 05:08:10 crc kubenswrapper[4841]: reg_dir_path="/etc/docker/certs.d/${f}"
Jan 30 05:08:10 crc kubenswrapper[4841]: if [ -e "${reg_dir_path}" ]; then
Jan 30 05:08:10 crc kubenswrapper[4841]: cp -u $ca_file_path $reg_dir_path/ca.crt
Jan 30 05:08:10 crc kubenswrapper[4841]: else
Jan 30 05:08:10 crc kubenswrapper[4841]: mkdir $reg_dir_path
Jan 30 05:08:10 crc kubenswrapper[4841]: cp $ca_file_path $reg_dir_path/ca.crt
Jan 30 05:08:10 crc kubenswrapper[4841]: fi
Jan 30 05:08:10 crc kubenswrapper[4841]: done
Jan 30 05:08:10 crc kubenswrapper[4841]: for d in $(ls /etc/docker/certs.d); do
Jan 30 05:08:10 crc kubenswrapper[4841]: echo $d
Jan 30 05:08:10 crc kubenswrapper[4841]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./')
Jan 30 05:08:10 crc kubenswrapper[4841]: reg_conf_path="/tmp/serviceca/${dp}"
Jan 30 05:08:10 crc kubenswrapper[4841]: if [ ! -e "${reg_conf_path}" ]; then
Jan 30 05:08:10 crc kubenswrapper[4841]: rm -rf /etc/docker/certs.d/$d
Jan 30 05:08:10 crc kubenswrapper[4841]: fi
Jan 30 05:08:10 crc kubenswrapper[4841]: done
Jan 30 05:08:10 crc kubenswrapper[4841]: sleep 60 & wait ${!}
Jan 30 05:08:10 crc kubenswrapper[4841]: done
Jan 30 05:08:10 crc kubenswrapper[4841]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9p8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-d7bdh_openshift-image-registry(577a5760-2c37-4852-90c0-4962c2362824): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 30 05:08:10 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:08:10 crc kubenswrapper[4841]: E0130 05:08:10.610057 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-d7bdh" podUID="577a5760-2c37-4852-90c0-4962c2362824" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.620962 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.628290 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.638689 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.648943 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.672033 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.679337 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.689367 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.697300 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.706104 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.716146 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.726453 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:10 crc kubenswrapper[4841]: I0130 05:08:10.740796 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.060225 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.067224 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.070119 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.070633 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.081949 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.092216 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.101930 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.119515 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.134161 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.149122 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.165450 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.192877 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.208116 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.231968 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.243037 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.254463 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.264993 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.272309 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.280334 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.288337 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.298745 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.307673 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.319029 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.329526 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.341557 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.357139 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.361756 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 08:39:40.877996276 +0000 UTC Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.367603 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:11 crc kubenswrapper[4841]: I0130 05:08:11.376583 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.101247 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.101390 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101447 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:16.10138958 +0000 UTC m=+33.094862248 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101502 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.101507 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101564 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:16.101546494 +0000 UTC m=+33.095019152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.101611 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101700 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101737 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101751 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101788 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101814 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:16.101793901 +0000 UTC m=+33.095266549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.101844 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:16.101829522 +0000 UTC m=+33.095302300 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.202704 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.203061 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.203126 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.203149 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.203241 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-30 05:08:16.203213124 +0000 UTC m=+33.196685792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.362333 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 07:49:58.104237524 +0000 UTC Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.431914 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.431965 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:12 crc kubenswrapper[4841]: I0130 05:08:12.431965 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.432157 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.432302 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:12 crc kubenswrapper[4841]: E0130 05:08:12.432501 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.363357 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 09:13:07.238776612 +0000 UTC Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.910501 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.914257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.914319 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.914338 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.914515 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.926075 4841 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.926369 4841 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.927861 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.927917 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.927935 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.927957 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.927975 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4841]: E0130 05:08:13.951084 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.960640 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.960705 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.960732 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.960764 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.960789 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.977243 4841 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 05:08:13 crc kubenswrapper[4841]: E0130 05:08:13.977736 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.982810 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.982859 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.982877 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.982902 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:13 crc kubenswrapper[4841]: I0130 05:08:13.982919 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:13Z","lastTransitionTime":"2026-01-30T05:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:13 crc kubenswrapper[4841]: E0130 05:08:13.998211 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.003088 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.003134 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.003152 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.003173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.003191 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.018909 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.023957 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.024022 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.024044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.024072 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.024094 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.040800 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.041226 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.043316 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.043392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.043451 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.043486 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.043511 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.075708 4841 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 30 05:08:14 crc kubenswrapper[4841]: W0130 05:08:14.077524 4841 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.155809 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.155883 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.155900 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.155923 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.155941 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.258815 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.258941 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.258973 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.258999 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.259021 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.279115 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.295237 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.295661 4841 scope.go:117] "RemoveContainer" containerID="042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.295646 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.295949 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.309160 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.326146 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.342042 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.359794 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.363043 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.363093 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.363109 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.363133 4841 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.363151 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.363801 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:18:48.212112277 +0000 UTC Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.376332 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.393242 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.424130 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.431666 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.431702 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.431844 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.431921 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.432148 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.432262 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.439343 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.450931 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.465008 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.465623 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.465675 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.465692 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.465717 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.465734 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.481579 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.492551 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.507009 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.521845 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.534203 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.551043 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.567043 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.569322 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.569379 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.569477 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.569514 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.569533 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.580744 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.595175 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.609796 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.620061 4841 scope.go:117] "RemoveContainer" containerID="042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3" Jan 30 05:08:14 crc kubenswrapper[4841]: E0130 05:08:14.620338 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.628988 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.643083 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.658236 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.674657 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.674723 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.674747 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.674777 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.674800 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.690908 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.705196 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.716479 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.777656 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.777713 4841 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.777730 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.777757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.777775 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.881205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.881266 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.881284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.881306 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.881321 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.983671 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.983782 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.983807 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.983835 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:14 crc kubenswrapper[4841]: I0130 05:08:14.983858 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:14Z","lastTransitionTime":"2026-01-30T05:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.087254 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.087327 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.087348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.087373 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.087495 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.190502 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.190577 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.190595 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.190620 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.190637 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.293554 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.293619 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.293640 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.293665 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.293686 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.364606 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 15:49:30.380561447 +0000 UTC Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.396865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.396940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.396970 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.396995 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.397013 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.500096 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.500157 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.500176 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.500202 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.500221 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.604000 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.604067 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.604085 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.604111 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.604132 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.707033 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.707093 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.707111 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.707136 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.707154 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.810362 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.810461 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.810479 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.810501 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.810523 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.915011 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.915094 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.915116 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.915147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:15 crc kubenswrapper[4841]: I0130 05:08:15.915170 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:15Z","lastTransitionTime":"2026-01-30T05:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.018699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.018760 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.018777 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.018805 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.018823 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.122113 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.122177 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.122194 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.122219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.122237 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.146089 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146235 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 05:08:24.146204214 +0000 UTC m=+41.139676882 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.146303 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.146391 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.146480 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146604 4841 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146625 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146702 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:24.146682746 +0000 UTC m=+41.140155414 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146732 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:24.146719337 +0000 UTC m=+41.140192015 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146808 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146836 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146853 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.146917 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:24.146900982 +0000 UTC m=+41.140373660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.225617 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.225668 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.225685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.225710 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.225727 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.247382 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.247596 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.247630 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.247649 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.247727 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:24.247706981 +0000 UTC m=+41.241179649 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.327859 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.327939 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.327962 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.327995 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.328019 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.365422 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 21:18:32.874882466 +0000 UTC Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430502 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430576 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430588 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430627 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430641 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430777 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430808 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.430786 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.430922 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.431017 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:16 crc kubenswrapper[4841]: E0130 05:08:16.431096 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.533233 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.533319 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.533345 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.533374 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.533393 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.636016 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.636138 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.636164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.636196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.636220 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.739233 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.739351 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.739376 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.739437 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.739461 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.842335 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.842452 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.842473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.842497 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.842515 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.945490 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.945540 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.945562 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.945586 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:16 crc kubenswrapper[4841]: I0130 05:08:16.945604 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:16Z","lastTransitionTime":"2026-01-30T05:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.049223 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.049307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.049329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.049363 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.049386 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.152318 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.152607 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.152634 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.152664 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.152688 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.255897 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.255943 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.255957 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.255977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.255992 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.359457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.359549 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.359567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.359590 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.359616 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.365999 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 16:31:31.340617424 +0000 UTC
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.463294 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.463368 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.463392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.463456 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.463479 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.566107 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.566215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.566276 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.566304 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.566323 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.668891 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.668967 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.668992 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.669020 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.669042 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.772113 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.772178 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.772197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.772221 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.772239 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.875653 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.875714 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.875730 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.875752 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.875768 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.979203 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.979267 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.979283 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.979307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:17 crc kubenswrapper[4841]: I0130 05:08:17.979324 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:17Z","lastTransitionTime":"2026-01-30T05:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.081960 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.082028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.082049 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.082077 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.082096 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.184251 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.184300 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.184311 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.184329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.184341 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.287512 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.287568 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.287589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.287614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.287634 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.366518 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 15:52:52.257929941 +0000 UTC
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.389955 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.390010 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.390027 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.390048 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.390064 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.431975 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.432148 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:18 crc kubenswrapper[4841]: E0130 05:08:18.432430 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.432466 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:18 crc kubenswrapper[4841]: E0130 05:08:18.432632 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:08:18 crc kubenswrapper[4841]: E0130 05:08:18.433011 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.493089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.493151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.493187 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.493211 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.493229 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.596234 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.596272 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.596281 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.596296 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.596305 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.699467 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.699562 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.699579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.699604 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.699621 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.802373 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.802496 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.802514 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.802537 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.802556 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.906120 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.906192 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.906209 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.906236 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:18 crc kubenswrapper[4841]: I0130 05:08:18.906254 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:18Z","lastTransitionTime":"2026-01-30T05:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.009665 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.009740 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.009758 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.009780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.009799 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.112751 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.112836 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.112854 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.112880 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.112901 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.215490 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.215535 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.215546 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.215563 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.215575 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.318889 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.318942 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.318982 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.319001 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.319012 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.367335 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 14:49:29.193735851 +0000 UTC
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.422867 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.422965 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.422992 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.423028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.423053 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.525918 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.525991 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.526014 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.526046 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.526068 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.628541 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.628604 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.628622 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.628649 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.628666 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.731834 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.731901 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.731921 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.731946 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.731965 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.835069 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.835134 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.835151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.835173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.835190 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.938518 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.938584 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.938605 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.938630 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.938648 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:19Z","lastTransitionTime":"2026-01-30T05:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.939477 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4"]
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.940375 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.942727 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.946108 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.951113 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.968675 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.979950 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.990758 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32e77eac-b883-466e-9ab3-921f182cca14-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.990825 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32e77eac-b883-466e-9ab3-921f182cca14-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.990913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97vl\" (UniqueName: 
\"kubernetes.io/projected/32e77eac-b883-466e-9ab3-921f182cca14-kube-api-access-d97vl\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.990951 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32e77eac-b883-466e-9ab3-921f182cca14-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:19 crc kubenswrapper[4841]: I0130 05:08:19.996456 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.006345 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.017290 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.030707 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.040750 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.040792 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.040804 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.040822 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.040835 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.050143 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.060604 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.070359 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.085360 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.091700 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97vl\" (UniqueName: \"kubernetes.io/projected/32e77eac-b883-466e-9ab3-921f182cca14-kube-api-access-d97vl\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.091779 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32e77eac-b883-466e-9ab3-921f182cca14-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.091890 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32e77eac-b883-466e-9ab3-921f182cca14-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.091952 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32e77eac-b883-466e-9ab3-921f182cca14-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.093008 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32e77eac-b883-466e-9ab3-921f182cca14-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.093558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32e77eac-b883-466e-9ab3-921f182cca14-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.099743 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32e77eac-b883-466e-9ab3-921f182cca14-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.114668 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.125471 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97vl\" (UniqueName: \"kubernetes.io/projected/32e77eac-b883-466e-9ab3-921f182cca14-kube-api-access-d97vl\") pod \"ovnkube-control-plane-749d76644c-gjgb4\" (UID: \"32e77eac-b883-466e-9ab3-921f182cca14\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.126531 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.140165 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.143982 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.144024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.144040 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.144063 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.144079 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.155496 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.246633 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.246691 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.246707 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.246725 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.246739 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.261428 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.350135 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.350176 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.350188 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.350210 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.350225 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.368107 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:19:48.791378017 +0000 UTC Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.431273 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.431307 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:20 crc kubenswrapper[4841]: E0130 05:08:20.431392 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.431292 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:20 crc kubenswrapper[4841]: E0130 05:08:20.431707 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:20 crc kubenswrapper[4841]: E0130 05:08:20.431738 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.453574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.453623 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.453634 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.453651 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.453663 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.556611 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.556664 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.556679 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.556699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.556714 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.637969 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" event={"ID":"32e77eac-b883-466e-9ab3-921f182cca14","Type":"ContainerStarted","Data":"13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.638221 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" event={"ID":"32e77eac-b883-466e-9ab3-921f182cca14","Type":"ContainerStarted","Data":"e9a0b4fae7630007219d54f5a76bb57a9ae724650c93095408ae7fe2629ea9ad"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.640551 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.660515 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.660556 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.660567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.660586 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.660599 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.763538 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.763862 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.763875 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.763892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.763905 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.866720 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.866784 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.866803 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.866833 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.866851 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.971355 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.971470 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.971492 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.971522 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:20 crc kubenswrapper[4841]: I0130 05:08:20.971550 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:20Z","lastTransitionTime":"2026-01-30T05:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.033816 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-25sxv"] Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.034703 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:21 crc kubenswrapper[4841]: E0130 05:08:21.034880 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.046693 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.061713 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.075567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.075627 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.075645 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.075669 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.075686 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.076947 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.090480 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.102749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod 
\"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.102974 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgl4\" (UniqueName: \"kubernetes.io/projected/1e275bab-612f-4fe8-8a4f-792634265c15-kube-api-access-cjgl4\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.103643 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.121236 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a5
27d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.136842 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.151841 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.165344 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.175289 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.178358 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.178417 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.178433 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.178455 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.178470 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.188155 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.196215 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.203027 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.203540 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgl4\" (UniqueName: \"kubernetes.io/projected/1e275bab-612f-4fe8-8a4f-792634265c15-kube-api-access-cjgl4\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.203617 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:21 crc kubenswrapper[4841]: E0130 05:08:21.203800 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:21 crc kubenswrapper[4841]: E0130 05:08:21.203868 
4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs podName:1e275bab-612f-4fe8-8a4f-792634265c15 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:21.703852792 +0000 UTC m=+38.697325440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs") pod "network-metrics-daemon-25sxv" (UID: "1e275bab-612f-4fe8-8a4f-792634265c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.214210 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.229389 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.232525 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgl4\" (UniqueName: \"kubernetes.io/projected/1e275bab-612f-4fe8-8a4f-792634265c15-kube-api-access-cjgl4\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.251123 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.281281 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.281381 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.281459 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.281495 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.281519 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.368522 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 11:14:38.092932045 +0000 UTC Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.384335 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.384375 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.384388 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.384426 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.384438 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.486366 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.486421 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.486431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.486447 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.486458 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.588689 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.588726 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.588736 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.588753 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.588764 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.644910 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" event={"ID":"32e77eac-b883-466e-9ab3-921f182cca14","Type":"ContainerStarted","Data":"7b19fbe2bb17150acda4325b99e97b65778ed7b272947befd5810ceb41763b35"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.646938 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.656997 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.669567 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.692116 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.692705 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.692763 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.692783 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.692807 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.692825 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.701902 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.707785 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.708210 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod 
\"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:21 crc kubenswrapper[4841]: E0130 05:08:21.708360 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:21 crc kubenswrapper[4841]: E0130 05:08:21.708426 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs podName:1e275bab-612f-4fe8-8a4f-792634265c15 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:22.708412568 +0000 UTC m=+39.701885206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs") pod "network-metrics-daemon-25sxv" (UID: "1e275bab-612f-4fe8-8a4f-792634265c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.720185 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.727903 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.733582 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.741245 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.748307 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.754625 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.760637 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.769765 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.777640 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.793163 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.796564 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.796591 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.796600 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.796615 4841 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.796625 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.802709 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.817996 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.824561 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.839463 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.850579 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.864928 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.874217 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a
9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.883765 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.897888 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.900249 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.900296 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.900313 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.900334 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.900350 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:21Z","lastTransitionTime":"2026-01-30T05:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.912700 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.925663 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.939897 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.952494 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.976364 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.988155 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:21 crc kubenswrapper[4841]: I0130 05:08:21.998978 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.003170 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.003234 4841 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.003247 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.003264 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.003277 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.010187 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.105917 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.105980 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 
05:08:22.105996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.106018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.106036 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.209222 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.209288 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.209304 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.209327 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.209342 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.312158 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.312210 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.312229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.312253 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.312273 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.369071 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 15:17:31.622561287 +0000 UTC Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.415188 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.415235 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.415253 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.415276 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.415293 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.431229 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.431657 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:22 crc kubenswrapper[4841]: E0130 05:08:22.431764 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.431873 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.432128 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:22 crc kubenswrapper[4841]: E0130 05:08:22.433042 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15" Jan 30 05:08:22 crc kubenswrapper[4841]: E0130 05:08:22.433187 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:22 crc kubenswrapper[4841]: E0130 05:08:22.432873 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.520640 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.520674 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.520686 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.520706 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.520721 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.624129 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.624638 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.624650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.624664 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.624674 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.650762 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerStarted","Data":"918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.653017 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d7bdh" event={"ID":"577a5760-2c37-4852-90c0-4962c2362824","Type":"ContainerStarted","Data":"35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.654976 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a" exitCode=0 Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.655031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.662340 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.676259 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.687688 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.695441 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.703566 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.715268 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.719084 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:22 crc kubenswrapper[4841]: E0130 05:08:22.719226 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:22 crc kubenswrapper[4841]: E0130 05:08:22.719302 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs podName:1e275bab-612f-4fe8-8a4f-792634265c15 nodeName:}" 
failed. No retries permitted until 2026-01-30 05:08:24.719286824 +0000 UTC m=+41.712759452 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs") pod "network-metrics-daemon-25sxv" (UID: "1e275bab-612f-4fe8-8a4f-792634265c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.724062 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.733488 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.733536 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.733551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.733576 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.733593 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.733533 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.749250 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.765882 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.777258 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a
9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.785502 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.796837 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.806365 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.820856 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.830791 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.835771 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.835812 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.835825 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc 
kubenswrapper[4841]: I0130 05:08:22.835842 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.835854 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.841457 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.850427 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.858561 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a
9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.865978 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.877014 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.889496 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.899835 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.907876 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.915996 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.928010 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.938200 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.938245 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.938261 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.938283 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.938299 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:22Z","lastTransitionTime":"2026-01-30T05:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.943837 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.954942 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.963053 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.974562 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.986961 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:22 crc kubenswrapper[4841]: I0130 05:08:22.995612 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.040656 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.040710 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.040723 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.040741 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.040755 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.144092 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.144151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.144169 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.144192 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.144209 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.247655 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.247714 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.247731 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.247754 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.247805 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.352196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.352258 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.352282 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.352313 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.352336 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.369472 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 03:41:01.769238286 +0000 UTC Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.455257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.455348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.455371 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.455454 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.455482 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.558103 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.558154 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.558166 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.558196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.558212 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.660264 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.660309 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.660326 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.660345 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.660356 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.660563 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" containerID="918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08" exitCode=0 Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.660659 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerDied","Data":"918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.664445 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c49cw" event={"ID":"262e0db9-4560-4557-823d-8a4145e03fd1","Type":"ContainerStarted","Data":"0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.668052 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0506b7378f588431575d0cb4ad90d8008cfc7d9961897b073dcd568e04f3ec26"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.679428 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.680690 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2hhs7" event={"ID":"bd6f6537-92be-4744-b292-751dd9ccdb1b","Type":"ContainerStarted","Data":"e9060620251cdf34f42186129a539242d97ba46a04d1fb24964087f302e5687c"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.702714 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.724041 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.739425 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.755780 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.768577 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.769050 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.769075 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.769107 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.769125 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.768762 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.792662 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.808142 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.821054 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.832190 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.841903 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.853163 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.869132 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.871859 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.871892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.871904 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.871922 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.871934 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.890223 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.902444 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.915193 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.927921 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.936071 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9060620251cdf34f42186129a539242d97ba46a04d1fb24964087f302e5687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.947941 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e822
86cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.959580 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.971530 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.974786 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.974806 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.974813 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.974826 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.974834 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:23Z","lastTransitionTime":"2026-01-30T05:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.980767 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de3
3c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:23 crc kubenswrapper[4841]: I0130 05:08:23.987829 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.001792 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0506b7378f588431575d0cb4ad90d8008cfc7d9961897b073dcd568e04f3ec26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.012357 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.027814 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.040102 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.050839 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.071244 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.077538 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.077580 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.077592 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.077610 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.077621 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.080540 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.089469 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.098180 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.147839 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.147890 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.147903 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.147921 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.147933 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.158182 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.164756 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.165100 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.165113 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.165132 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.165143 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.174028 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.180671 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.180704 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.180714 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.180727 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.180739 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.190742 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.194938 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.194982 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.194994 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.195010 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.195021 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.204135 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.209521 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.209555 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.209565 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.209579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.209588 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.221887 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98d8a8ed-cf63-480b-98f6-6728ad28fc06\\\",\\\"systemUUID\\\":\\\"a1d8c29a-8e5b-4ce6-b544-127e3ff9ee5c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.221994 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.223744 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.223769 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.223778 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.223793 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.223802 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.235346 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.235484 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.235551 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.235514695 +0000 UTC m=+57.228987333 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.235631 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.235721 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.235640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.235726 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.235700889 +0000 UTC m=+57.229173567 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.235844 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.235798592 +0000 UTC m=+57.229271230 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.235927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.236064 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.236083 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 
05:08:24.236093 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.236137 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.23613015 +0000 UTC m=+57.229602788 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.327075 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.327111 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.327119 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.327133 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.327143 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.336248 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.336385 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.336419 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.336429 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.336475 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-30 05:08:40.336461087 +0000 UTC m=+57.329933725 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.370509 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 10:07:27.850410031 +0000 UTC Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.429693 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.429752 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.429769 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.429794 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.429817 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.430953 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.430998 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.431033 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.431185 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.431435 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.431628 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.431718 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.431718 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.442156 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0506b7378f588431575d0cb4ad90d8008cfc7d9961897b073dcd568e04f3ec26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.457020 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.477259 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.496031 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.508939 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.532445 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.532487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.532499 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.532516 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.532527 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.534216 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:
08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.553151 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.563352 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.571115 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.581028 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.594574 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.601293 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9060620251cdf34f42186129a539242d97ba46a04d1fb24964087f302e5687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.611012 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.621473 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.630868 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a
9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.634338 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.634380 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.634414 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.634437 4841 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.634452 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.637187 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.686839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2ac1352555e30ad3a2772326c592768744a65a53d847fee56442d4ac7e068e4d"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.689626 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" containerID="33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3" exitCode=0 Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.689678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerDied","Data":"33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.697727 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.697798 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.697820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.697839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" 
event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.697856 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.697877 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.707735 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.724425 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.736027 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.736859 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.736908 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.736928 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.736955 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.736973 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.739710 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.740535 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: E0130 05:08:24.740607 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs podName:1e275bab-612f-4fe8-8a4f-792634265c15 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:28.740590513 +0000 UTC m=+45.734063161 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs") pod "network-metrics-daemon-25sxv" (UID: "1e275bab-612f-4fe8-8a4f-792634265c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.747546 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-01-30T05:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.766072 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.773331 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9060620251cdf34f42186129a539242d97ba46a04d1fb24964087f302e5687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.783373 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.794825 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.801535 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a
9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.808783 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.822193 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a5
27d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.832896 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.839599 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.839627 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.839635 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.839648 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.839656 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.843207 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.852558 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.866524 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0506b7378f588431575d0cb4ad90d8008cfc7d9961897b073dcd568e04f3ec26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.876684 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.942559 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.942611 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.942629 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.942654 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:24 crc kubenswrapper[4841]: I0130 05:08:24.942673 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:24Z","lastTransitionTime":"2026-01-30T05:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.045202 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.045240 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.045248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.045262 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.045272 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.153082 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.153163 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.153190 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.153213 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.153231 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.256265 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.256337 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.256361 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.256390 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.256472 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.359955 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.360025 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.360043 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.360069 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.360088 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.371231 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 05:19:53.987289686 +0000 UTC Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.464145 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.464197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.464209 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.464226 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.464238 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.567167 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.567222 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.567241 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.567265 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.567282 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.670751 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.670835 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.670861 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.670889 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.670907 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.703177 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"419a36e5dd867de47c8ba277a16af6868ea7d0922c8f43e0fdf4ddc8bd5f7c16"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.705174 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f06108ef4f8f63937c80e664012f008eecf1bb5a0fa822ab2909be47b0e41419"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.707912 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" containerID="e7cecd43f161b3eee2f102d219aa2d418a428b0bc03ccdd8185514927485e880" exitCode=0 Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.707951 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerDied","Data":"e7cecd43f161b3eee2f102d219aa2d418a428b0bc03ccdd8185514927485e880"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.727042 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc 
kubenswrapper[4841]: I0130 05:08:25.750221 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.766904 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.773973 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.774004 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.774024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.774041 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.774050 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.780053 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.796203 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.811348 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0506b7378f588431575d0cb4ad90d8008cfc7d9961897b073dcd568e04f3ec26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.831909 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.848612 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.858723 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.873999 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419a36e5dd867de47c8ba277a16af6868ea7d0922c8f43e0fdf4ddc8bd5f7c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.875734 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.875760 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.875769 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.875781 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.875791 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.888732 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z 
is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.912367 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.929363 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.941308 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.955130 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.967067 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9060620251cdf34f42186129a539242d97ba46a04d1fb24964087f302e5687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.977801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.977841 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.977853 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.977871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.977885 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:25Z","lastTransitionTime":"2026-01-30T05:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.982946 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5282a9-eab1-4f08-a846-9c65d177b71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d8954304b3f58d808d917e6f12aafd9b175439f71c4e1cc106e5ac47f57351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ed81e6b5f4
b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29d7f1203932767c07319d33156b0dcb0e999c912c190347302da767bb8166e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82286cf6ee4a650097dab61d1c95fd42c500f4fc238f92508225bc0bccbf59b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:25 crc kubenswrapper[4841]: I0130 05:08:25.999470 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f06108ef4f8f63937c80e664012f008eecf1bb5a0fa822ab2909be47b0e41419\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac1352555e30ad3a2772326c592768744a65a53d847fee56442d4ac7e068e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:25Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.013584 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2hhs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6f6537-92be-4744-b292-751dd9ccdb1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9060620251cdf34f42186129a539242d97ba46a04d1fb24964087f302e5687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxs7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2hhs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.025548 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25sxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e275bab-612f-4fe8-8a4f-792634265c15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjgl4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25sxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc 
kubenswrapper[4841]: I0130 05:08:26.039999 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f791ed0a-befc-479e-862b-deb440b67c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T05:08:07Z\\\",\\\"message\\\":\\\"file observer\\\\nW0130 05:08:06.935679 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0130 05:08:06.935971 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 05:08:06.937016 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-847916023/tls.crt::/tmp/serving-cert-847916023/tls.key\\\\\\\"\\\\nI0130 05:08:07.351898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 05:08:07.354389 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 05:08:07.354419 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 05:08:07.354440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 05:08:07.354445 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 05:08:07.361764 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 05:08:07.361784 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 05:08:07.361792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 05:08:07.361795 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 05:08:07.361798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 05:08:07.361801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 05:08:07.361922 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 05:08:07.367377 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:07:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:07:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.054880 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.065775 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a24700eb-27ff-4126-9f6a-40ee9575e5ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d35a37de33c3c613ef43ace617c6729e7e1665ef40647296e57dc14fa55316\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a
9a737dc90e90c85810ebf801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsqj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hd8v2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.075892 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32e77eac-b883-466e-9ab3-921f182cca14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13935d5717314a05d572f9c1dbfbc83ae73eda0bda4e908506ef36fbda98a003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b19fbe2bb17150acda4325b99e97b65778ed
7b272947befd5810ceb41763b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d97vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjgb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.080257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.080297 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.080313 4841 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.080334 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.080347 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.088565 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0506b7378f588431575d0cb4ad90d8008cfc7d9961897b073dcd568e04f3ec26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.105762 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.124004 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd2ac8bc-c695-499d-aacb-0e47a29aa569\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918c42d0672329b9984acd2204f21acf6ed006666e5b186a00ea251dd3ebaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33ecdcaaa545cc29bc38ff1c697571e99626c2fd351ec019b9e1135ae9e25cd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7cecd43f161b3eee2f102d219aa2d418a428b0bc03ccdd8185514927485e880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cecd43f161b3eee2f102d219aa2d418a428b0bc03ccdd8185514927485e880\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5ft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2ww56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 
05:08:26.137084 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d7bdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"577a5760-2c37-4852-90c0-4962c2362824\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35f8f7def03e94f78561f9ea30a34ca0f8e0fa00adda80bef84452359f3b4e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n9p8z\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d7bdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.150485 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419a36e5dd867de47c8ba277a16af6868ea7d0922c8f43e0fdf4ddc8bd5f7c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.169684 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c49cw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"262e0db9-4560-4557-823d-8a4145e03fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T05:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdgz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c49cw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.183774 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc 
kubenswrapper[4841]: I0130 05:08:26.183831 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.183849 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.183873 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.183891 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.199473 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.217330 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.286936 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.287232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc 
kubenswrapper[4841]: I0130 05:08:26.287396 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.287572 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.287700 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.371940 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 17:23:22.28167181 +0000 UTC Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.392955 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.393043 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.393070 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.393101 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.393123 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.431928 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.432571 4841 scope.go:117] "RemoveContainer" containerID="042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.432205 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:26 crc kubenswrapper[4841]: E0130 05:08:26.432998 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:26 crc kubenswrapper[4841]: E0130 05:08:26.432891 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.432043 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.432351 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:26 crc kubenswrapper[4841]: E0130 05:08:26.433082 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:26 crc kubenswrapper[4841]: E0130 05:08:26.433334 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.496746 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.497013 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.497035 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.497060 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.497077 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.600494 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.600551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.600569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.600594 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.600614 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.703455 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.703516 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.703534 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.703558 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.703574 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.717749 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.720990 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" containerID="1d5e55aa3ac109571b4c3366c18f06f9f7c2c2d1707d82c36eb88f95ae543ae7" exitCode=0 Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.721079 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerDied","Data":"1d5e55aa3ac109571b4c3366c18f06f9f7c2c2d1707d82c36eb88f95ae543ae7"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.725490 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.727823 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.728823 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.747805 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f5d8664-d53a-4e96-9458-fd915cec77b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T05:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T05:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qldm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T05:08:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fl5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.767884 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T05:08:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T05:08:26Z is after 2025-08-24T17:21:41Z" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.807589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.807649 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.807666 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 
05:08:26.807693 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.807711 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.822698 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d7bdh" podStartSLOduration=19.822674292 podStartE2EDuration="19.822674292s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:26.802127866 +0000 UTC m=+43.795600544" watchObservedRunningTime="2026-01-30 05:08:26.822674292 +0000 UTC m=+43.816146980" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.852005 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c49cw" podStartSLOduration=19.851987311 podStartE2EDuration="19.851987311s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:26.839930348 +0000 UTC m=+43.833403026" watchObservedRunningTime="2026-01-30 05:08:26.851987311 +0000 UTC m=+43.845459959" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.880097 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2hhs7" podStartSLOduration=19.880079148 podStartE2EDuration="19.880079148s" 
podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:26.863540202 +0000 UTC m=+43.857012850" watchObservedRunningTime="2026-01-30 05:08:26.880079148 +0000 UTC m=+43.873551796" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.880337 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=15.880331515 podStartE2EDuration="15.880331515s" podCreationTimestamp="2026-01-30 05:08:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:26.879454263 +0000 UTC m=+43.872926911" watchObservedRunningTime="2026-01-30 05:08:26.880331515 +0000 UTC m=+43.873804163" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.915114 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.915155 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.915172 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.915193 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.915210 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:26Z","lastTransitionTime":"2026-01-30T05:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.925805 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podStartSLOduration=19.925786819 podStartE2EDuration="19.925786819s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:26.912268249 +0000 UTC m=+43.905740917" watchObservedRunningTime="2026-01-30 05:08:26.925786819 +0000 UTC m=+43.919259477" Jan 30 05:08:26 crc kubenswrapper[4841]: I0130 05:08:26.982721 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjgb4" podStartSLOduration=19.982701843 podStartE2EDuration="19.982701843s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:26.98220519 +0000 UTC m=+43.975677838" watchObservedRunningTime="2026-01-30 05:08:26.982701843 +0000 UTC m=+43.976174491" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.013568 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.01354904 podStartE2EDuration="13.01354904s" podCreationTimestamp="2026-01-30 05:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:27.011907958 +0000 UTC m=+44.005380606" watchObservedRunningTime="2026-01-30 05:08:27.01354904 +0000 UTC m=+44.007021688" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.016890 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.016926 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.016946 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.016963 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.016975 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.119569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.119630 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.119649 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.119677 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.119697 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.222298 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.222349 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.222366 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.222393 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.222452 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.325800 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.325848 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.325860 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.325877 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.325902 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.372722 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 00:57:52.404725314 +0000 UTC Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.429245 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.429310 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.429329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.429354 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.429379 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.532027 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.532081 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.532099 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.532122 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.532142 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.634850 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.634895 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.634911 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.634932 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.634947 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.735072 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" containerID="7d41d45adf56ff8344afd1afe04535d2472908747e6fbb67320f319e41559d52" exitCode=0 Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.736287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerDied","Data":"7d41d45adf56ff8344afd1afe04535d2472908747e6fbb67320f319e41559d52"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.738891 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.738929 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.738945 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.738966 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.738982 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.842045 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.842108 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.842123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.842148 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.842166 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.945115 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.945185 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.945209 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.945240 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:27 crc kubenswrapper[4841]: I0130 05:08:27.945262 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:27Z","lastTransitionTime":"2026-01-30T05:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.050875 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.050931 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.050947 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.050968 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.050984 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.154032 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.154081 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.154102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.154129 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.154150 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.256886 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.256951 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.256974 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.257002 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.257023 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.359848 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.359887 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.359897 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.359910 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.359918 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.373467 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:43:10.336260668 +0000 UTC Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.431065 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.431115 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.431088 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.431073 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:28 crc kubenswrapper[4841]: E0130 05:08:28.431351 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:28 crc kubenswrapper[4841]: E0130 05:08:28.431497 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15" Jan 30 05:08:28 crc kubenswrapper[4841]: E0130 05:08:28.431653 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:28 crc kubenswrapper[4841]: E0130 05:08:28.431779 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.463121 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.463163 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.463176 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.463193 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.463206 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.565622 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.565685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.565708 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.565743 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.565767 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.668950 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.669010 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.669027 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.669050 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.669066 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.746771 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd2ac8bc-c695-499d-aacb-0e47a29aa569" containerID="7adf7b64c20f383d76bbcbf1691f89c950d5b39bf715095e94d09fad5c679b82" exitCode=0 Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.746837 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerDied","Data":"7adf7b64c20f383d76bbcbf1691f89c950d5b39bf715095e94d09fad5c679b82"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.772490 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.772970 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.772993 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.773026 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.773051 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.784112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:28 crc kubenswrapper[4841]: E0130 05:08:28.784377 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:28 crc kubenswrapper[4841]: E0130 05:08:28.784478 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs podName:1e275bab-612f-4fe8-8a4f-792634265c15 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:36.784454114 +0000 UTC m=+53.777926762 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs") pod "network-metrics-daemon-25sxv" (UID: "1e275bab-612f-4fe8-8a4f-792634265c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.882553 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.882603 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.882615 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.882679 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.882703 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.985218 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.985257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.985268 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.985284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:28 crc kubenswrapper[4841]: I0130 05:08:28.985296 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:28Z","lastTransitionTime":"2026-01-30T05:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.087137 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.087172 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.087182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.087197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.087208 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.189112 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.189158 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.189169 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.189187 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.189198 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.292113 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.292182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.292202 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.292230 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.292250 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.373588 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 17:15:37.077327841 +0000 UTC Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.395529 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.395589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.395606 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.395633 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.395650 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.498302 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.498351 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.498369 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.498395 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.498446 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.600874 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.600909 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.600922 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.600937 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.600950 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.703841 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.703894 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.703913 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.703937 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.703955 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.756871 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerStarted","Data":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.757304 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.757340 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.757471 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.765385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2ww56" event={"ID":"cd2ac8bc-c695-499d-aacb-0e47a29aa569","Type":"ContainerStarted","Data":"46a2364c75dabe34c9dd6352d80eaaa5190a5ea6d82ab7b940657b52c76c3ebd"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.807733 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.807785 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.807802 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.807825 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.807842 4841 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.810091 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podStartSLOduration=22.810032439 podStartE2EDuration="22.810032439s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:29.809953397 +0000 UTC m=+46.803426125" watchObservedRunningTime="2026-01-30 05:08:29.810032439 +0000 UTC m=+46.803505117" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.837651 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.841856 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.848731 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2ww56" podStartSLOduration=22.848713433 podStartE2EDuration="22.848713433s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:29.847701548 +0000 UTC m=+46.841174226" watchObservedRunningTime="2026-01-30 05:08:29.848713433 +0000 UTC m=+46.842186091" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 
05:08:29.910621 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.910659 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.910667 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.910679 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:29 crc kubenswrapper[4841]: I0130 05:08:29.910689 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:29Z","lastTransitionTime":"2026-01-30T05:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.013996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.014045 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.014062 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.014087 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.014105 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.116873 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.116932 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.116950 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.116973 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.116990 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.219565 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.219607 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.219619 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.219637 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.219651 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.322262 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.322296 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.322304 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.322318 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.322327 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.374036 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 03:11:24.981744898 +0000 UTC Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.424639 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.424674 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.424683 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.424697 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.424706 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.431259 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.431316 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:30 crc kubenswrapper[4841]: E0130 05:08:30.431354 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.431461 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.431316 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:30 crc kubenswrapper[4841]: E0130 05:08:30.431499 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 05:08:30 crc kubenswrapper[4841]: E0130 05:08:30.431641 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15" Jan 30 05:08:30 crc kubenswrapper[4841]: E0130 05:08:30.431733 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.527425 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.527462 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.527471 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.527503 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.527513 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.630452 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.630520 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.630540 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.630568 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.630585 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.733485 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.733545 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.733562 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.733588 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.733609 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.839107 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.839174 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.839196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.839234 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.839258 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.942202 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.942254 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.942272 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.942305 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:30 crc kubenswrapper[4841]: I0130 05:08:30.942323 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:30Z","lastTransitionTime":"2026-01-30T05:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.045726 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.046138 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.046157 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.046182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.046200 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.150651 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.150699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.150717 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.150742 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.150759 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.253536 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.253564 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.253575 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.253591 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.253603 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.356136 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.356168 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.356179 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.356196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.356209 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.374991 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 13:12:36.855497463 +0000 UTC Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.458196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.458229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.458239 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.458254 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.458265 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.560199 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.560231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.560242 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.560260 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.560272 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.663737 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.663776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.663786 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.663801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.663809 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.719501 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-25sxv"] Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.719628 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:31 crc kubenswrapper[4841]: E0130 05:08:31.719712 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.766034 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.766102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.766119 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.766144 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.766161 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.869641 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.869734 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.869753 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.869788 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.869815 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.973147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.973201 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.973220 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.973243 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:31 crc kubenswrapper[4841]: I0130 05:08:31.973262 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:31Z","lastTransitionTime":"2026-01-30T05:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.076751 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.076794 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.076807 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.076828 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.076843 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.180128 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.180180 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.180196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.180221 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.180238 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.283310 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.283382 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.283429 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.283457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.283475 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.375774 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 08:37:58.173388412 +0000 UTC
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.386517 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.386585 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.386603 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.386628 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.386646 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.431456 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:32 crc kubenswrapper[4841]: E0130 05:08:32.431644 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.431706 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.431869 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:32 crc kubenswrapper[4841]: E0130 05:08:32.431952 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:08:32 crc kubenswrapper[4841]: E0130 05:08:32.432280 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.528016 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.528077 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.528097 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.528122 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.528139 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.630708 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.630767 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.630786 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.630809 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.630826 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.743497 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.743571 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.743592 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.743621 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.743642 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.847486 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.847545 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.847563 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.847589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.847610 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.950007 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.950072 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.950097 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.950128 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:32 crc kubenswrapper[4841]: I0130 05:08:32.950149 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:32Z","lastTransitionTime":"2026-01-30T05:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.052954 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.053029 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.053055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.053083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.053132 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.156448 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.156511 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.156532 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.156563 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.156584 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.260086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.260259 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.260320 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.260344 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.260360 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.363182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.363246 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.363262 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.363286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.363303 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.376919 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 01:16:09.511581076 +0000 UTC
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.431498 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv"
Jan 30 05:08:33 crc kubenswrapper[4841]: E0130 05:08:33.431729 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.466554 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.466604 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.466617 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.466635 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.466647 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.569361 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.569457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.569474 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.569499 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.569517 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.672010 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.672073 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.672091 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.672114 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.672130 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.774767 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.774808 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.774818 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.774836 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.774848 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.877954 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.878026 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.878045 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.878075 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.878100 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.981488 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.981528 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.981538 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.981552 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:33 crc kubenswrapper[4841]: I0130 05:08:33.981562 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:33Z","lastTransitionTime":"2026-01-30T05:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.084522 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.084632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.084657 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.084694 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.084716 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.189105 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.189195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.189213 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.189238 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.189258 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.292654 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.292774 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.292794 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.292818 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.292842 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.370851 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.370930 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.371019 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.371121 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.371160 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.377293 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:25:33.608970491 +0000 UTC
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.396256 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.396336 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.396362 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.396394 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.396499 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T05:08:34Z","lastTransitionTime":"2026-01-30T05:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.431260 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.431260 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.431262 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:34 crc kubenswrapper[4841]: E0130 05:08:34.432928 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 05:08:34 crc kubenswrapper[4841]: E0130 05:08:34.433122 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 05:08:34 crc kubenswrapper[4841]: E0130 05:08:34.433290 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.443088 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj"]
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.443752 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.446079 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cb92e0-893d-484b-8d0c-be7c353b8766-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.446145 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49cb92e0-893d-484b-8d0c-be7c353b8766-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.446189 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.446197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/49cb92e0-893d-484b-8d0c-be7c353b8766-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj"
Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.446375 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/49cb92e0-893d-484b-8d0c-be7c353b8766-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: 
\"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.446610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49cb92e0-893d-484b-8d0c-be7c353b8766-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.446789 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.449066 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.449071 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.547949 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/49cb92e0-893d-484b-8d0c-be7c353b8766-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.548011 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/49cb92e0-893d-484b-8d0c-be7c353b8766-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 
05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.548086 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49cb92e0-893d-484b-8d0c-be7c353b8766-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.548127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/49cb92e0-893d-484b-8d0c-be7c353b8766-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.548165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cb92e0-893d-484b-8d0c-be7c353b8766-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.548200 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49cb92e0-893d-484b-8d0c-be7c353b8766-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.548246 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/49cb92e0-893d-484b-8d0c-be7c353b8766-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.549837 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49cb92e0-893d-484b-8d0c-be7c353b8766-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.556298 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cb92e0-893d-484b-8d0c-be7c353b8766-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.578066 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49cb92e0-893d-484b-8d0c-be7c353b8766-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7m7fj\" (UID: \"49cb92e0-893d-484b-8d0c-be7c353b8766\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: I0130 05:08:34.767641 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" Jan 30 05:08:34 crc kubenswrapper[4841]: W0130 05:08:34.794632 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49cb92e0_893d_484b_8d0c_be7c353b8766.slice/crio-22ead9477f5e4e30add0283081b8bcb8218ece6309880defe5efefd00c893412 WatchSource:0}: Error finding container 22ead9477f5e4e30add0283081b8bcb8218ece6309880defe5efefd00c893412: Status 404 returned error can't find the container with id 22ead9477f5e4e30add0283081b8bcb8218ece6309880defe5efefd00c893412 Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.377535 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 19:01:42.389462234 +0000 UTC Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.377608 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.395830 4841 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.431263 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:35 crc kubenswrapper[4841]: E0130 05:08:35.431648 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25sxv" podUID="1e275bab-612f-4fe8-8a4f-792634265c15" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.742262 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.742535 4841 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.799670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" event={"ID":"49cb92e0-893d-484b-8d0c-be7c353b8766","Type":"ContainerStarted","Data":"27cb514c926809e67e1a1e29c5b930c80c1afd2a4da565ccab12a586c58bbebd"} Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.800087 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" event={"ID":"49cb92e0-893d-484b-8d0c-be7c353b8766","Type":"ContainerStarted","Data":"22ead9477f5e4e30add0283081b8bcb8218ece6309880defe5efefd00c893412"} Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.802247 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.802927 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.804726 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.805303 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.806161 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qrk5s"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.806893 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.808520 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9wm9f"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.811470 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.812095 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.813149 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.815545 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.815809 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.815907 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-r9f92"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.816130 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.816462 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.816699 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.816883 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.817063 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.817264 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.817488 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 05:08:35 crc 
kubenswrapper[4841]: I0130 05:08:35.817693 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.817891 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.818263 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.818709 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.819011 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.821514 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.821682 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.822594 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.824783 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5hkjb"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.825689 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.826810 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.827504 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.831937 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.837019 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bznfv"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.837232 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.837485 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.837604 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.837909 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.837996 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.838113 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.840837 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7zvv6"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.841499 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.856643 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.857305 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.857663 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.860280 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.860489 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.860699 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.860858 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.865722 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.885901 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.885936 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886102 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886286 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886453 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886600 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886656 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886606 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886759 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886850 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886872 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886907 4841 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886974 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887024 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887101 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.886863 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887154 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887260 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887343 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887292 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887560 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887608 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887560 4841 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887700 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887790 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887839 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887497 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887871 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887930 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.887305 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.888000 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2dp8z"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.888011 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.888188 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 
05:08:35.888213 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.888782 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.889046 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls"] Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.889516 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.889634 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.889905 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.890542 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.890630 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.890700 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.892409 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.892669 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.892771 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.893139 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.893387 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.893536 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.895132 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ffmj4"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.895641 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.896025 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.896020 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.896346 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.896752 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.900135 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-t9l6j"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.900676 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.900755 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.900764 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t9l6j"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.901107 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.901715 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.902427 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.906250 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r5skt"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.906769 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.908025 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z5pll"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.908087 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.908558 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-z5pll"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.907205 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.909092 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.909211 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.909432 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.921704 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.924464 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.926918 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.927706 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934186 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.927828 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.936553 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928025 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928062 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.945073 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lrbq7"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928079 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928149 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928185 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928275 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928374 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928420 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928483 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928541 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.928572 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.931194 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.931279 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934286 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934324 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934343 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934357 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934539 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934613 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934674 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934719 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934735 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934766 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.934792 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.945460 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.950160 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.954854 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.957499 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.957917 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2f7jj"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.958015 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.958297 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.958490 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.962588 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.962998 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.963259 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qrk5s"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.963329 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.963792 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.972486 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.973260 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9wm9f"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.973364 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.974151 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.976583 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.976851 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c6xs"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.976941 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.977130 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c7m6v"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.977278 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.977333 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.977737 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.978371 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wh9ns"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.978735 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.979954 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.980516 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.982426 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.982752 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.982805 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.982853 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.982999 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.984529 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5hkjb"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.984581 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.985898 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.986440 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bznfv"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.986455 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.987079 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.987568 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7zvv6"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.995749 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pcg6z"]
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.995661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.997526 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-policies\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.997576 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66edf007-1920-4e59-a256-52a9360dcf9f-config\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.997750 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564hh\" (UniqueName: \"kubernetes.io/projected/66edf007-1920-4e59-a256-52a9360dcf9f-kube-api-access-564hh\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.997800 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-service-ca-bundle\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.997831 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gjqz\" (UniqueName: \"kubernetes.io/projected/9eee3621-c382-4ee6-a955-0061726a0214-kube-api-access-4gjqz\") pod \"cluster-samples-operator-665b6dd947-qqfld\" (UID: \"9eee3621-c382-4ee6-a955-0061726a0214\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.997952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-config\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.998173 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-client-ca\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.998481 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9eee3621-c382-4ee6-a955-0061726a0214-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qqfld\" (UID: \"9eee3621-c382-4ee6-a955-0061726a0214\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.999653 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:35 crc kubenswrapper[4841]: I0130 05:08:35.999793 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:35.999991 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-etcd-serving-ca\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.000246 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-client-ca\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.000331 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-encryption-config\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.001016 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-serving-cert\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.001089 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-config\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.002070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.002148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-serving-cert\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.002360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-etcd-client\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.002495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.006984 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-service-ca\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.007047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.007100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.007123 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5rvh\" (UniqueName: \"kubernetes.io/projected/29892f01-d39f-41cd-aa3c-402791553b2c-kube-api-access-l5rvh\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.016953 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-console-config\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.017366 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-957v6\" (UniqueName: \"kubernetes.io/projected/692a0681-d33c-43ff-b458-8c2302df6bd9-kube-api-access-957v6\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.017806 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-audit-policies\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.018154 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-serving-cert\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.018387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-audit\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.018435 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.018464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-oauth-config\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.020760 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-encryption-config\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.020781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.020798 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-image-import-ca\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.020813 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.020834 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/692a0681-d33c-43ff-b458-8c2302df6bd9-node-pullsecrets\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.020849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.020867 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6fqx\" (UniqueName: \"kubernetes.io/projected/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-kube-api-access-z6fqx\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021118 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a785f654-41ed-4c03-baf7-b0fb5bc3f543-serving-cert\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021141 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-trusted-ca-bundle\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021156 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-oauth-serving-cert\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021176 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021192 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1228156-5459-400b-97d7-16c75238223b-serving-cert\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021260 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021349 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/692a0681-d33c-43ff-b458-8c2302df6bd9-audit-dir\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09769310-f1d3-49d3-87bf-1921c35b32de-audit-dir\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021422 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-etcd-client\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/29892f01-d39f-41cd-aa3c-402791553b2c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021451 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-serving-cert\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021478 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021492 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-config\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021506 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021520 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021537 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/66edf007-1920-4e59-a256-52a9360dcf9f-machine-approver-tls\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021550 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwzk9\" (UniqueName: \"kubernetes.io/projected/d1228156-5459-400b-97d7-16c75238223b-kube-api-access-hwzk9\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021564 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn6xp\" (UniqueName: \"kubernetes.io/projected/a785f654-41ed-4c03-baf7-b0fb5bc3f543-kube-api-access-nn6xp\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021585 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-dir\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021600 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29892f01-d39f-41cd-aa3c-402791553b2c-config\") pod 
\"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021631 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmh4m\" (UniqueName: \"kubernetes.io/projected/68fe97c1-4b26-445b-af5b-73808e119f0b-kube-api-access-gmh4m\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021647 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hmb\" (UniqueName: \"kubernetes.io/projected/09769310-f1d3-49d3-87bf-1921c35b32de-kube-api-access-j2hmb\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5cp\" (UniqueName: \"kubernetes.io/projected/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-kube-api-access-wn5cp\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021764 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66edf007-1920-4e59-a256-52a9360dcf9f-auth-proxy-config\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021780 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-config\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.021795 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/29892f01-d39f-41cd-aa3c-402791553b2c-images\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.023461 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pcg6z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.026186 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.026556 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-76czq"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.027276 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-76czq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.028648 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.029692 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-r9f92"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.031024 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2dp8z"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.032360 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.034475 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.036118 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.038431 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ffmj4"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.038467 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t9l6j"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.039223 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.040258 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.041230 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.042122 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.043228 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.045035 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.045074 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.046039 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.048010 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.049951 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z5pll"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.051094 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-76czq"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.052355 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c6xs"] Jan 30 
05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.053308 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.054505 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.056467 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.057530 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.058689 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.059849 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2f7jj"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.061050 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c7m6v"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.062246 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.063528 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.065576 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lrbq7"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.068479 4841 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.069848 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7kqcm"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.071127 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.071176 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.072121 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7kqcm"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.073324 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7wz6l"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.074533 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7wz6l" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.075226 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7wz6l"] Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.084922 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.104765 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.123388 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.123504 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkn47\" (UniqueName: \"kubernetes.io/projected/00b1b4b6-71d6-41a0-94e9-e1f137961e72-kube-api-access-rkn47\") pod \"dns-operator-744455d44c-2dp8z\" (UID: \"00b1b4b6-71d6-41a0-94e9-e1f137961e72\") " pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.123538 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345f9dbf-0dbd-4d48-841f-0f9637618c3a-config\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.123587 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9eee3621-c382-4ee6-a955-0061726a0214-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qqfld\" (UID: \"9eee3621-c382-4ee6-a955-0061726a0214\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.123695 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5zv\" (UniqueName: \"kubernetes.io/projected/de438831-f663-43cd-98f9-72e133534c61-kube-api-access-rv5zv\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.123789 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.123859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.124200 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-etcd-serving-ca\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " 
pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.124244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-serving-cert\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.124304 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-service-ca\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.125438 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-etcd-serving-ca\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.125927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-client-ca\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126072 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-encryption-config\") pod \"apiserver-76f77b778f-r9f92\" (UID: 
\"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126193 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-config\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126306 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/01747236-9ab9-46b2-952a-2c065de19cf4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126564 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-serving-cert\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126619 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126673 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-etcd-client\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126713 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdk24\" (UniqueName: \"kubernetes.io/projected/345f9dbf-0dbd-4d48-841f-0f9637618c3a-kube-api-access-cdk24\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.126794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127029 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-config\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127202 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/345f9dbf-0dbd-4d48-841f-0f9637618c3a-serving-cert\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127295 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127333 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-service-ca\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gpkz\" (UniqueName: \"kubernetes.io/projected/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-kube-api-access-8gpkz\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5rvh\" (UniqueName: \"kubernetes.io/projected/29892f01-d39f-41cd-aa3c-402791553b2c-kube-api-access-l5rvh\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: 
I0130 05:08:36.127488 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127558 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127597 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-console-config\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127654 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-ca\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127690 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-957v6\" (UniqueName: \"kubernetes.io/projected/692a0681-d33c-43ff-b458-8c2302df6bd9-kube-api-access-957v6\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " 
pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127726 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-config\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-client\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.127783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-audit-policies\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.128411 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.128108 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-serving-cert\") pod \"apiserver-7bbb656c7d-n2xzz\" 
(UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.128652 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-audit\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.128558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-client-ca\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.129073 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9eee3621-c382-4ee6-a955-0061726a0214-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qqfld\" (UID: \"9eee3621-c382-4ee6-a955-0061726a0214\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.128684 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-config\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.129673 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt9x8\" (UniqueName: 
\"kubernetes.io/projected/999a31cf-76fd-4c51-82df-21bcf988140d-kube-api-access-bt9x8\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.129894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8353c559-01f4-4b08-bf29-81566a889797-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.129926 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.129969 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de438831-f663-43cd-98f9-72e133534c61-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.130094 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.130157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-oauth-config\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.130235 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09769310-f1d3-49d3-87bf-1921c35b32de-audit-policies\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.130278 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-encryption-config\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.130580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.130824 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-console-config\") pod \"console-f9d7485db-7zvv6\" (UID: 
\"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.131003 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-image-import-ca\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.131146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.131101 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.131889 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-service-ca\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.132218 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.131490 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.132533 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/692a0681-d33c-43ff-b458-8c2302df6bd9-node-pullsecrets\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.132708 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.132824 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/de438831-f663-43cd-98f9-72e133534c61-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc 
kubenswrapper[4841]: I0130 05:08:36.132833 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/692a0681-d33c-43ff-b458-8c2302df6bd9-node-pullsecrets\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.132878 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6fqx\" (UniqueName: \"kubernetes.io/projected/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-kube-api-access-z6fqx\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a785f654-41ed-4c03-baf7-b0fb5bc3f543-serving-cert\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133078 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb68d\" (UniqueName: \"kubernetes.io/projected/66397da5-478a-4800-93b9-012a7684f0ad-kube-api-access-nb68d\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133115 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjvv2\" (UniqueName: \"kubernetes.io/projected/5a250092-4e4c-4edf-943a-23b7ffe49bab-kube-api-access-wjvv2\") pod 
\"multus-admission-controller-857f4d67dd-lrbq7\" (UID: \"5a250092-4e4c-4edf-943a-23b7ffe49bab\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133179 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-trusted-ca-bundle\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-oauth-serving-cert\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133317 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1228156-5459-400b-97d7-16c75238223b-serving-cert\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/692a0681-d33c-43ff-b458-8c2302df6bd9-audit-dir\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133512 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.133990 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09769310-f1d3-49d3-87bf-1921c35b32de-audit-dir\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.134101 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09769310-f1d3-49d3-87bf-1921c35b32de-audit-dir\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.134283 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-etcd-client\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.134522 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/692a0681-d33c-43ff-b458-8c2302df6bd9-audit-dir\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.134676 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-audit\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.134685 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de438831-f663-43cd-98f9-72e133534c61-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.134861 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-etcd-client\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.134957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/29892f01-d39f-41cd-aa3c-402791553b2c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135214 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135256 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135270 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66397da5-478a-4800-93b9-012a7684f0ad-srv-cert\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135364 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-serving-cert\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135431 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-oauth-serving-cert\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135472 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-config\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135604 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135653 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01747236-9ab9-46b2-952a-2c065de19cf4-serving-cert\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135795 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00b1b4b6-71d6-41a0-94e9-e1f137961e72-metrics-tls\") pod \"dns-operator-744455d44c-2dp8z\" (UID: \"00b1b4b6-71d6-41a0-94e9-e1f137961e72\") " pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" 
Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135837 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8353c559-01f4-4b08-bf29-81566a889797-proxy-tls\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135897 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-image-import-ca\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135906 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwzk9\" (UniqueName: \"kubernetes.io/projected/d1228156-5459-400b-97d7-16c75238223b-kube-api-access-hwzk9\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135284 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-encryption-config\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135964 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-oauth-config\") pod \"console-f9d7485db-7zvv6\" (UID: 
\"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.135937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn6xp\" (UniqueName: \"kubernetes.io/projected/a785f654-41ed-4c03-baf7-b0fb5bc3f543-kube-api-access-nn6xp\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-encryption-config\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136266 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/999a31cf-76fd-4c51-82df-21bcf988140d-serving-cert\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136692 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/5a250092-4e4c-4edf-943a-23b7ffe49bab-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lrbq7\" (UID: \"5a250092-4e4c-4edf-943a-23b7ffe49bab\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136858 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136909 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136942 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/66edf007-1920-4e59-a256-52a9360dcf9f-machine-approver-tls\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.136984 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccj6\" (UniqueName: \"kubernetes.io/projected/8353c559-01f4-4b08-bf29-81566a889797-kube-api-access-5ccj6\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.137039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29892f01-d39f-41cd-aa3c-402791553b2c-config\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.137532 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.137619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmh4m\" 
(UniqueName: \"kubernetes.io/projected/68fe97c1-4b26-445b-af5b-73808e119f0b-kube-api-access-gmh4m\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.138438 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692a0681-d33c-43ff-b458-8c2302df6bd9-config\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.138729 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-dir\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.138767 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.138868 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66edf007-1920-4e59-a256-52a9360dcf9f-auth-proxy-config\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.138921 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-config\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.138959 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/29892f01-d39f-41cd-aa3c-402791553b2c-images\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.139201 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-dir\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.139408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/29892f01-d39f-41cd-aa3c-402791553b2c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.139558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: 
I0130 05:08:36.139865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-config\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.139942 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzf5q\" (UniqueName: \"kubernetes.io/projected/01747236-9ab9-46b2-952a-2c065de19cf4-kube-api-access-mzf5q\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140010 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hmb\" (UniqueName: \"kubernetes.io/projected/09769310-f1d3-49d3-87bf-1921c35b32de-kube-api-access-j2hmb\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5cp\" (UniqueName: \"kubernetes.io/projected/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-kube-api-access-wn5cp\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140122 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/66edf007-1920-4e59-a256-52a9360dcf9f-auth-proxy-config\") pod 
\"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140141 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66397da5-478a-4800-93b9-012a7684f0ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140239 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8xd\" (UniqueName: \"kubernetes.io/projected/76152f0e-2b76-469b-a55e-f94c53fe9e4d-kube-api-access-cp8xd\") pod \"downloads-7954f5f757-t9l6j\" (UID: \"76152f0e-2b76-469b-a55e-f94c53fe9e4d\") " pod="openshift-console/downloads-7954f5f757-t9l6j" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140316 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-ready\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140350 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-policies\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66edf007-1920-4e59-a256-52a9360dcf9f-config\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140427 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-564hh\" (UniqueName: \"kubernetes.io/projected/66edf007-1920-4e59-a256-52a9360dcf9f-kube-api-access-564hh\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-service-ca-bundle\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140488 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140514 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gjqz\" (UniqueName: \"kubernetes.io/projected/9eee3621-c382-4ee6-a955-0061726a0214-kube-api-access-4gjqz\") pod \"cluster-samples-operator-665b6dd947-qqfld\" (UID: \"9eee3621-c382-4ee6-a955-0061726a0214\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140541 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-config\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140571 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-client-ca\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140595 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.140912 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-policies\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.141058 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66edf007-1920-4e59-a256-52a9360dcf9f-config\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.141361 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1228156-5459-400b-97d7-16c75238223b-service-ca-bundle\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.141441 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1228156-5459-400b-97d7-16c75238223b-serving-cert\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.141648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29892f01-d39f-41cd-aa3c-402791553b2c-config\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.141979 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-trusted-ca-bundle\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.141948 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/29892f01-d39f-41cd-aa3c-402791553b2c-images\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.142151 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-client-ca\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.142318 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-config\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.142710 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 
05:08:36.142848 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-etcd-client\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.142975 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a785f654-41ed-4c03-baf7-b0fb5bc3f543-serving-cert\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.142973 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-serving-cert\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.143142 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.143246 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 
30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.143487 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09769310-f1d3-49d3-87bf-1921c35b32de-serving-cert\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.144763 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.145108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/692a0681-d33c-43ff-b458-8c2302df6bd9-serving-cert\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.146277 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/66edf007-1920-4e59-a256-52a9360dcf9f-machine-approver-tls\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.165693 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.185584 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.205382 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.224871 4841 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242056 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-ca\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242126 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-client\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242163 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-config\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9x8\" (UniqueName: \"kubernetes.io/projected/999a31cf-76fd-4c51-82df-21bcf988140d-kube-api-access-bt9x8\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242273 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8353c559-01f4-4b08-bf29-81566a889797-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242302 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de438831-f663-43cd-98f9-72e133534c61-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242462 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242537 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/de438831-f663-43cd-98f9-72e133534c61-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242652 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjvv2\" (UniqueName: \"kubernetes.io/projected/5a250092-4e4c-4edf-943a-23b7ffe49bab-kube-api-access-wjvv2\") pod \"multus-admission-controller-857f4d67dd-lrbq7\" (UID: \"5a250092-4e4c-4edf-943a-23b7ffe49bab\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" Jan 30 05:08:36 
crc kubenswrapper[4841]: I0130 05:08:36.242730 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb68d\" (UniqueName: \"kubernetes.io/projected/66397da5-478a-4800-93b9-012a7684f0ad-kube-api-access-nb68d\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242800 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66397da5-478a-4800-93b9-012a7684f0ad-srv-cert\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242863 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de438831-f663-43cd-98f9-72e133534c61-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242894 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01747236-9ab9-46b2-952a-2c065de19cf4-serving-cert\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00b1b4b6-71d6-41a0-94e9-e1f137961e72-metrics-tls\") pod \"dns-operator-744455d44c-2dp8z\" (UID: 
\"00b1b4b6-71d6-41a0-94e9-e1f137961e72\") " pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.242982 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8353c559-01f4-4b08-bf29-81566a889797-proxy-tls\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243030 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/999a31cf-76fd-4c51-82df-21bcf988140d-serving-cert\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243065 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a250092-4e4c-4edf-943a-23b7ffe49bab-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lrbq7\" (UID: \"5a250092-4e4c-4edf-943a-23b7ffe49bab\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccj6\" (UniqueName: \"kubernetes.io/projected/8353c559-01f4-4b08-bf29-81566a889797-kube-api-access-5ccj6\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243200 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzf5q\" (UniqueName: 
\"kubernetes.io/projected/01747236-9ab9-46b2-952a-2c065de19cf4-kube-api-access-mzf5q\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243314 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-ready\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66397da5-478a-4800-93b9-012a7684f0ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243391 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp8xd\" (UniqueName: \"kubernetes.io/projected/76152f0e-2b76-469b-a55e-f94c53fe9e4d-kube-api-access-cp8xd\") pod \"downloads-7954f5f757-t9l6j\" (UID: \"76152f0e-2b76-469b-a55e-f94c53fe9e4d\") " pod="openshift-console/downloads-7954f5f757-t9l6j" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243463 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243521 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243625 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkn47\" (UniqueName: \"kubernetes.io/projected/00b1b4b6-71d6-41a0-94e9-e1f137961e72-kube-api-access-rkn47\") pod \"dns-operator-744455d44c-2dp8z\" (UID: \"00b1b4b6-71d6-41a0-94e9-e1f137961e72\") " pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243654 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345f9dbf-0dbd-4d48-841f-0f9637618c3a-config\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243702 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5zv\" (UniqueName: \"kubernetes.io/projected/de438831-f663-43cd-98f9-72e133534c61-kube-api-access-rv5zv\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243738 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-service-ca\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/01747236-9ab9-46b2-952a-2c065de19cf4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243818 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdk24\" (UniqueName: \"kubernetes.io/projected/345f9dbf-0dbd-4d48-841f-0f9637618c3a-kube-api-access-cdk24\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243869 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gpkz\" (UniqueName: \"kubernetes.io/projected/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-kube-api-access-8gpkz\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243892 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-config\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.243914 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345f9dbf-0dbd-4d48-841f-0f9637618c3a-serving-cert\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.244924 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-ready\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.245156 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/01747236-9ab9-46b2-952a-2c065de19cf4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.245494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8353c559-01f4-4b08-bf29-81566a889797-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: 
I0130 05:08:36.245595 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-config\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.245703 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.246153 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.246331 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-ca\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.246706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-service-ca\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.246755 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de438831-f663-43cd-98f9-72e133534c61-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.248847 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/999a31cf-76fd-4c51-82df-21bcf988140d-etcd-client\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.249278 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/de438831-f663-43cd-98f9-72e133534c61-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.249520 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00b1b4b6-71d6-41a0-94e9-e1f137961e72-metrics-tls\") pod \"dns-operator-744455d44c-2dp8z\" (UID: \"00b1b4b6-71d6-41a0-94e9-e1f137961e72\") " pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.250612 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/999a31cf-76fd-4c51-82df-21bcf988140d-serving-cert\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.251885 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01747236-9ab9-46b2-952a-2c065de19cf4-serving-cert\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.265487 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.284908 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.304859 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.325179 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.346253 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.385679 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.406932 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.423340 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8353c559-01f4-4b08-bf29-81566a889797-proxy-tls\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.425327 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.431087 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.431982 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.432710 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.445587 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.465703 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.485113 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.505001 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.526028 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.545195 4841 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.561130 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a250092-4e4c-4edf-943a-23b7ffe49bab-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lrbq7\" (UID: \"5a250092-4e4c-4edf-943a-23b7ffe49bab\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.565617 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.586231 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.606357 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.625170 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.645478 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.665033 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.686439 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.705551 
4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.726508 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.745586 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.767608 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.782299 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66397da5-478a-4800-93b9-012a7684f0ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.785523 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.806317 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.826374 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.845347 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.853178 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:36 crc kubenswrapper[4841]: E0130 05:08:36.853634 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:36 crc kubenswrapper[4841]: E0130 05:08:36.853766 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs podName:1e275bab-612f-4fe8-8a4f-792634265c15 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:52.85374012 +0000 UTC m=+69.847212788 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs") pod "network-metrics-daemon-25sxv" (UID: "1e275bab-612f-4fe8-8a4f-792634265c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.866856 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.881366 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66397da5-478a-4800-93b9-012a7684f0ad-srv-cert\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.885814 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 05:08:36 crc 
kubenswrapper[4841]: I0130 05:08:36.906267 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.925308 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.954019 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.965781 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.984481 4841 request.go:700] Waited for 1.010723482s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/secrets?fieldSelector=metadata.name%3Dmetrics-tls&limit=500&resourceVersion=0 Jan 30 05:08:36 crc kubenswrapper[4841]: I0130 05:08:36.985990 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.005583 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.026171 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.046316 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.065906 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.085576 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.095902 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.105738 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.135382 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.146861 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.166292 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.185478 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.206393 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.225211 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.245049 4841 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 30 05:08:37 crc 
kubenswrapper[4841]: E0130 05:08:37.245365 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-serving-cert podName:f36c8b3b-03b1-41fb-8649-660e3cdb1bf3 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.74533498 +0000 UTC m=+54.738807628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-serving-cert") pod "kube-controller-manager-operator-78b949d7b-slrnt" (UID: "f36c8b3b-03b1-41fb-8649-660e3cdb1bf3") : failed to sync secret cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.245919 4841 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.245975 4841 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.246056 4841 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.246074 4841 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.246101 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.246021 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-config podName:f36c8b3b-03b1-41fb-8649-660e3cdb1bf3 
nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.745988517 +0000 UTC m=+54.739461245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-config") pod "kube-controller-manager-operator-78b949d7b-slrnt" (UID: "f36c8b3b-03b1-41fb-8649-660e3cdb1bf3") : failed to sync configmap cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.246171 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist podName:87eccd50-4e4a-408b-aa2e-3c431b6d17d0 nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.74613559 +0000 UTC m=+54.739608318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-wh9ns" (UID: "87eccd50-4e4a-408b-aa2e-3c431b6d17d0") : failed to sync configmap cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.246209 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/345f9dbf-0dbd-4d48-841f-0f9637618c3a-serving-cert podName:345f9dbf-0dbd-4d48-841f-0f9637618c3a nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.746189582 +0000 UTC m=+54.739662370 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/345f9dbf-0dbd-4d48-841f-0f9637618c3a-serving-cert") pod "service-ca-operator-777779d784-mms8q" (UID: "345f9dbf-0dbd-4d48-841f-0f9637618c3a") : failed to sync secret cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: E0130 05:08:37.246241 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/345f9dbf-0dbd-4d48-841f-0f9637618c3a-config podName:345f9dbf-0dbd-4d48-841f-0f9637618c3a nodeName:}" failed. No retries permitted until 2026-01-30 05:08:37.746225183 +0000 UTC m=+54.739697981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/345f9dbf-0dbd-4d48-841f-0f9637618c3a-config") pod "service-ca-operator-777779d784-mms8q" (UID: "345f9dbf-0dbd-4d48-841f-0f9637618c3a") : failed to sync configmap cache: timed out waiting for the condition Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.265714 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.284978 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.305359 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.325456 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.345227 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.365543 4841 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.384370 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.405153 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.426219 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.431774 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.445993 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.465804 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.485971 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.506199 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.525481 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.545816 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.566164 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.585105 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.625348 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.645892 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.665815 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.686072 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.706033 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.726325 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.745289 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.766515 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.766549 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-config\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.766580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345f9dbf-0dbd-4d48-841f-0f9637618c3a-serving-cert\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.766637 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.766773 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.766797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/345f9dbf-0dbd-4d48-841f-0f9637618c3a-config\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.767497 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345f9dbf-0dbd-4d48-841f-0f9637618c3a-config\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.767827 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-config\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.767997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.784870 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345f9dbf-0dbd-4d48-841f-0f9637618c3a-serving-cert\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.785161 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.787721 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.805278 4841 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.826493 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.846311 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.865775 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.923394 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-957v6\" (UniqueName: \"kubernetes.io/projected/692a0681-d33c-43ff-b458-8c2302df6bd9-kube-api-access-957v6\") pod \"apiserver-76f77b778f-r9f92\" (UID: \"692a0681-d33c-43ff-b458-8c2302df6bd9\") " pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.935520 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5rvh\" (UniqueName: \"kubernetes.io/projected/29892f01-d39f-41cd-aa3c-402791553b2c-kube-api-access-l5rvh\") pod \"machine-api-operator-5694c8668f-qrk5s\" (UID: \"29892f01-d39f-41cd-aa3c-402791553b2c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.944536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6fqx\" (UniqueName: \"kubernetes.io/projected/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-kube-api-access-z6fqx\") pod \"oauth-openshift-558db77b4-5hkjb\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") " pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.964070 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn6xp\" (UniqueName: \"kubernetes.io/projected/a785f654-41ed-4c03-baf7-b0fb5bc3f543-kube-api-access-nn6xp\") pod \"controller-manager-879f6c89f-9wm9f\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f"
Jan 30 05:08:37 crc kubenswrapper[4841]: I0130 05:08:37.983976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwzk9\" (UniqueName: \"kubernetes.io/projected/d1228156-5459-400b-97d7-16c75238223b-kube-api-access-hwzk9\") pod \"authentication-operator-69f744f599-bznfv\" (UID: \"d1228156-5459-400b-97d7-16c75238223b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.004101 4841 request.go:700] Waited for 1.863843344s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.010024 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmh4m\" (UniqueName: \"kubernetes.io/projected/68fe97c1-4b26-445b-af5b-73808e119f0b-kube-api-access-gmh4m\") pod \"console-f9d7485db-7zvv6\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.024542 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.034943 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5cp\" (UniqueName: \"kubernetes.io/projected/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-kube-api-access-wn5cp\") pod \"route-controller-manager-6576b87f9c-52nnq\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.041960 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hmb\" (UniqueName: \"kubernetes.io/projected/09769310-f1d3-49d3-87bf-1921c35b32de-kube-api-access-j2hmb\") pod \"apiserver-7bbb656c7d-n2xzz\" (UID: \"09769310-f1d3-49d3-87bf-1921c35b32de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.043439 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.054156 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.059359 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-564hh\" (UniqueName: \"kubernetes.io/projected/66edf007-1920-4e59-a256-52a9360dcf9f-kube-api-access-564hh\") pod \"machine-approver-56656f9798-zg9xj\" (UID: \"66edf007-1920-4e59-a256-52a9360dcf9f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.078568 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gjqz\" (UniqueName: \"kubernetes.io/projected/9eee3621-c382-4ee6-a955-0061726a0214-kube-api-access-4gjqz\") pod \"cluster-samples-operator-665b6dd947-qqfld\" (UID: \"9eee3621-c382-4ee6-a955-0061726a0214\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.083961 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-r9f92"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.104999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkn47\" (UniqueName: \"kubernetes.io/projected/00b1b4b6-71d6-41a0-94e9-e1f137961e72-kube-api-access-rkn47\") pod \"dns-operator-744455d44c-2dp8z\" (UID: \"00b1b4b6-71d6-41a0-94e9-e1f137961e72\") " pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.122106 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9x8\" (UniqueName: \"kubernetes.io/projected/999a31cf-76fd-4c51-82df-21bcf988140d-kube-api-access-bt9x8\") pod \"etcd-operator-b45778765-ffmj4\" (UID: \"999a31cf-76fd-4c51-82df-21bcf988140d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.123637 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.144974 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.151434 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.154978 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7zvv6"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.156593 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccj6\" (UniqueName: \"kubernetes.io/projected/8353c559-01f4-4b08-bf29-81566a889797-kube-api-access-5ccj6\") pod \"machine-config-controller-84d6567774-chrvd\" (UID: \"8353c559-01f4-4b08-bf29-81566a889797\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.162369 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.178487 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzf5q\" (UniqueName: \"kubernetes.io/projected/01747236-9ab9-46b2-952a-2c065de19cf4-kube-api-access-mzf5q\") pod \"openshift-config-operator-7777fb866f-wpv8z\" (UID: \"01747236-9ab9-46b2-952a-2c065de19cf4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.178853 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de438831-f663-43cd-98f9-72e133534c61-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.182473 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.188590 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.217372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjvv2\" (UniqueName: \"kubernetes.io/projected/5a250092-4e4c-4edf-943a-23b7ffe49bab-kube-api-access-wjvv2\") pod \"multus-admission-controller-857f4d67dd-lrbq7\" (UID: \"5a250092-4e4c-4edf-943a-23b7ffe49bab\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.226984 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.227917 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36c8b3b-03b1-41fb-8649-660e3cdb1bf3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-slrnt\" (UID: \"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.238194 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.278705 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.279069 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gpkz\" (UniqueName: \"kubernetes.io/projected/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-kube-api-access-8gpkz\") pod \"cni-sysctl-allowlist-ds-wh9ns\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.281229 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.294904 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb68d\" (UniqueName: \"kubernetes.io/projected/66397da5-478a-4800-93b9-012a7684f0ad-kube-api-access-nb68d\") pod \"olm-operator-6b444d44fb-bxkf4\" (UID: \"66397da5-478a-4800-93b9-012a7684f0ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.301877 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdk24\" (UniqueName: \"kubernetes.io/projected/345f9dbf-0dbd-4d48-841f-0f9637618c3a-kube-api-access-cdk24\") pod \"service-ca-operator-777779d784-mms8q\" (UID: \"345f9dbf-0dbd-4d48-841f-0f9637618c3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.308096 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp8xd\" (UniqueName: \"kubernetes.io/projected/76152f0e-2b76-469b-a55e-f94c53fe9e4d-kube-api-access-cp8xd\") pod \"downloads-7954f5f757-t9l6j\" (UID: \"76152f0e-2b76-469b-a55e-f94c53fe9e4d\") " pod="openshift-console/downloads-7954f5f757-t9l6j"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.312358 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.320093 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5zv\" (UniqueName: \"kubernetes.io/projected/de438831-f663-43cd-98f9-72e133534c61-kube-api-access-rv5zv\") pod \"cluster-image-registry-operator-dc59b4c8b-4n5bd\" (UID: \"de438831-f663-43cd-98f9-72e133534c61\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.351271 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qrk5s"]
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.351521 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.367265 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.378419 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.385114 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.386768 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.407327 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.407877 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.410852 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-r9f92"]
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.426728 4841 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.445014 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.466112 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.475878 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.494026 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t9l6j"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586041 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw22t\" (UniqueName: \"kubernetes.io/projected/7599df20-a4f3-4f48-b413-8eec3c0bba38-kube-api-access-tw22t\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586284 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-srv-cert\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586332 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jpxb\" (UniqueName: \"kubernetes.io/projected/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-kube-api-access-6jpxb\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586352 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-config-volume\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbvl\" (UniqueName: \"kubernetes.io/projected/ea8f46a8-b567-409c-ba03-4bdb0f85259d-kube-api-access-9vbvl\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586410 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck45n\" (UniqueName: \"kubernetes.io/projected/be148e1a-b698-4b18-abfa-9988c3c31971-kube-api-access-ck45n\") pod \"migrator-59844c95c7-f8frn\" (UID: \"be148e1a-b698-4b18-abfa-9988c3c31971\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586436 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-default-certificate\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586452 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c547h\" (UniqueName: \"kubernetes.io/projected/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-kube-api-access-c547h\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586499 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wm2k\" (UniqueName: \"kubernetes.io/projected/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-kube-api-access-7wm2k\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fzn\" (UniqueName: \"kubernetes.io/projected/d1e3388a-70c3-4588-a357-7131eae20e2e-kube-api-access-w5fzn\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586541 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-config\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586555 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e3388a-70c3-4588-a357-7131eae20e2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586572 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586588 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnvmw\" (UniqueName: \"kubernetes.io/projected/64d09cb6-73a6-4de8-8164-d8a241df4e5c-kube-api-access-bnvmw\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-stats-auth\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.586638 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc438d29-a2a7-4774-9e46-93aa6a827129-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.588239 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-apiservice-cert\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.589315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.589434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2xzc\" (UniqueName: \"kubernetes.io/projected/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-kube-api-access-m2xzc\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.589536 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-config\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.589657 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1e3388a-70c3-4588-a357-7131eae20e2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.589748 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-serving-cert\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.589769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-certificates\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.589783 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593163 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/890f082d-3030-4e2a-bf11-fc6fb3f8bf18-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p8zbg\" (UID: \"890f082d-3030-4e2a-bf11-fc6fb3f8bf18\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593196 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sg89\" (UniqueName: \"kubernetes.io/projected/d7ad6da3-78c5-4541-8b41-bdcde44d577f-kube-api-access-8sg89\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593233 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593250 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8f46a8-b567-409c-ba03-4bdb0f85259d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593293 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-tls\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ec5e12-1868-4efd-a76c-e7a06360cb3b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593321 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7ad6da3-78c5-4541-8b41-bdcde44d577f-signing-cabundle\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-trusted-ca\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593385 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64d09cb6-73a6-4de8-8164-d8a241df4e5c-service-ca-bundle\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593473 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hz4\" (UniqueName: \"kubernetes.io/projected/3a4f6ee9-be68-47f5-a898-c39acbdb2852-kube-api-access-78hz4\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593490 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593505 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593543 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmm95\" (UniqueName: \"kubernetes.io/projected/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-kube-api-access-bmm95\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593580 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a4f6ee9-be68-47f5-a898-c39acbdb2852-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593597 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-trusted-ca\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593631 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1e3388a-70c3-4588-a357-7131eae20e2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593691 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a4f6ee9-be68-47f5-a898-c39acbdb2852-images\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593708 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8f46a8-b567-409c-ba03-4bdb0f85259d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wngnh\" (UniqueName: \"kubernetes.io/projected/cc438d29-a2a7-4774-9e46-93aa6a827129-kube-api-access-wngnh\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593751 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-tmpfs\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.593768 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-metrics-certs\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594427 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tss7\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-kube-api-access-7tss7\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7ad6da3-78c5-4541-8b41-bdcde44d577f-signing-key\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594523 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594567 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79ec5e12-1868-4efd-a76c-e7a06360cb3b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594583 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a4f6ee9-be68-47f5-a898-c39acbdb2852-proxy-tls\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594598 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-profile-collector-cert\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594679 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-webhook-cert\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f2ade6a-33a5-4643-84ee-b2bd43c55446-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s42sv\" (UID: \"5f2ade6a-33a5-4643-84ee-b2bd43c55446\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.594755 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdrcr\" (UniqueName: \"kubernetes.io/projected/5f2ade6a-33a5-4643-84ee-b2bd43c55446-kube-api-access-rdrcr\") pod \"control-plane-machine-set-operator-78cbb6b69f-s42sv\" (UID: \"5f2ade6a-33a5-4643-84ee-b2bd43c55446\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.601477 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-bound-sa-token\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.601669 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 
05:08:38.601718 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.601745 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqqg\" (UniqueName: \"kubernetes.io/projected/890f082d-3030-4e2a-bf11-fc6fb3f8bf18-kube-api-access-bfqqg\") pod \"package-server-manager-789f6589d5-p8zbg\" (UID: \"890f082d-3030-4e2a-bf11-fc6fb3f8bf18\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.601812 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc438d29-a2a7-4774-9e46-93aa6a827129-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.601850 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-secret-volume\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:38 crc kubenswrapper[4841]: E0130 05:08:38.602773 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.102759172 +0000 UTC m=+56.096231810 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.665747 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9wm9f"] Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703037 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703166 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-config-volume\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbvl\" (UniqueName: \"kubernetes.io/projected/ea8f46a8-b567-409c-ba03-4bdb0f85259d-kube-api-access-9vbvl\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703209 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck45n\" (UniqueName: \"kubernetes.io/projected/be148e1a-b698-4b18-abfa-9988c3c31971-kube-api-access-ck45n\") pod \"migrator-59844c95c7-f8frn\" (UID: \"be148e1a-b698-4b18-abfa-9988c3c31971\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-csi-data-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703277 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-default-certificate\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703298 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/afcb5d20-fdc9-4472-b201-253a90897fa5-certs\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703320 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8hmh\" (UniqueName: 
\"kubernetes.io/projected/afcb5d20-fdc9-4472-b201-253a90897fa5-kube-api-access-l8hmh\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703338 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c547h\" (UniqueName: \"kubernetes.io/projected/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-kube-api-access-c547h\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703354 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wm2k\" (UniqueName: \"kubernetes.io/projected/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-kube-api-access-7wm2k\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c95986c1-1a6e-442a-bb94-4c6778398fec-metrics-tls\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703421 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5fzn\" (UniqueName: \"kubernetes.io/projected/d1e3388a-70c3-4588-a357-7131eae20e2e-kube-api-access-w5fzn\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" Jan 30 05:08:38 crc 
kubenswrapper[4841]: I0130 05:08:38.703456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-config\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703477 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e3388a-70c3-4588-a357-7131eae20e2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703493 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703588 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnvmw\" (UniqueName: \"kubernetes.io/projected/64d09cb6-73a6-4de8-8164-d8a241df4e5c-kube-api-access-bnvmw\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703603 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-stats-auth\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " 
pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703638 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc438d29-a2a7-4774-9e46-93aa6a827129-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-apiservice-cert\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703709 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703725 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2xzc\" (UniqueName: \"kubernetes.io/projected/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-kube-api-access-m2xzc\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703743 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/dcae0fc1-0485-491b-9003-ded9c95fe166-cert\") pod \"ingress-canary-76czq\" (UID: \"dcae0fc1-0485-491b-9003-ded9c95fe166\") " pod="openshift-ingress-canary/ingress-canary-76czq" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703758 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-registration-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703784 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-config\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703800 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1e3388a-70c3-4588-a357-7131eae20e2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703818 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-serving-cert\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703837 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-certificates\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703871 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/afcb5d20-fdc9-4472-b201-253a90897fa5-node-bootstrap-token\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703892 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/890f082d-3030-4e2a-bf11-fc6fb3f8bf18-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p8zbg\" (UID: \"890f082d-3030-4e2a-bf11-fc6fb3f8bf18\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703909 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sg89\" (UniqueName: \"kubernetes.io/projected/d7ad6da3-78c5-4541-8b41-bdcde44d577f-kube-api-access-8sg89\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703942 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8f46a8-b567-409c-ba03-4bdb0f85259d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-plugins-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-tls\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.703988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrjfd\" (UniqueName: 
\"kubernetes.io/projected/ee896b81-0955-4a1e-a9ac-20887e4612c1-kube-api-access-hrjfd\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704003 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ec5e12-1868-4efd-a76c-e7a06360cb3b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704018 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7ad6da3-78c5-4541-8b41-bdcde44d577f-signing-cabundle\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704033 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-trusted-ca\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704048 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92vbz\" (UniqueName: \"kubernetes.io/projected/c95986c1-1a6e-442a-bb94-4c6778398fec-kube-api-access-92vbz\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64d09cb6-73a6-4de8-8164-d8a241df4e5c-service-ca-bundle\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704081 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-socket-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704100 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hz4\" (UniqueName: \"kubernetes.io/projected/3a4f6ee9-be68-47f5-a898-c39acbdb2852-kube-api-access-78hz4\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704114 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704147 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmm95\" (UniqueName: \"kubernetes.io/projected/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-kube-api-access-bmm95\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704162 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a4f6ee9-be68-47f5-a898-c39acbdb2852-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704177 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704193 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-trusted-ca\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704207 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/d1e3388a-70c3-4588-a357-7131eae20e2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a4f6ee9-be68-47f5-a898-c39acbdb2852-images\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704241 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8f46a8-b567-409c-ba03-4bdb0f85259d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704257 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wngnh\" (UniqueName: \"kubernetes.io/projected/cc438d29-a2a7-4774-9e46-93aa6a827129-kube-api-access-wngnh\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704273 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4vst\" (UniqueName: \"kubernetes.io/projected/dcae0fc1-0485-491b-9003-ded9c95fe166-kube-api-access-c4vst\") pod \"ingress-canary-76czq\" (UID: \"dcae0fc1-0485-491b-9003-ded9c95fe166\") " pod="openshift-ingress-canary/ingress-canary-76czq" 
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704297 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-tmpfs\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-metrics-certs\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704328 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tss7\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-kube-api-access-7tss7\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7ad6da3-78c5-4541-8b41-bdcde44d577f-signing-key\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704358 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79ec5e12-1868-4efd-a76c-e7a06360cb3b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704439 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a4f6ee9-be68-47f5-a898-c39acbdb2852-proxy-tls\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704457 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-profile-collector-cert\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-webhook-cert\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704493 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5f2ade6a-33a5-4643-84ee-b2bd43c55446-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s42sv\" (UID: \"5f2ade6a-33a5-4643-84ee-b2bd43c55446\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704513 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdrcr\" (UniqueName: \"kubernetes.io/projected/5f2ade6a-33a5-4643-84ee-b2bd43c55446-kube-api-access-rdrcr\") pod \"control-plane-machine-set-operator-78cbb6b69f-s42sv\" (UID: \"5f2ade6a-33a5-4643-84ee-b2bd43c55446\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-bound-sa-token\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704577 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfqqg\" (UniqueName: \"kubernetes.io/projected/890f082d-3030-4e2a-bf11-fc6fb3f8bf18-kube-api-access-bfqqg\") pod \"package-server-manager-789f6589d5-p8zbg\" (UID: \"890f082d-3030-4e2a-bf11-fc6fb3f8bf18\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704594 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc438d29-a2a7-4774-9e46-93aa6a827129-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704609 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-secret-volume\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704624 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-mountpoint-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704639 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c95986c1-1a6e-442a-bb94-4c6778398fec-config-volume\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704673 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw22t\" (UniqueName: \"kubernetes.io/projected/7599df20-a4f3-4f48-b413-8eec3c0bba38-kube-api-access-tw22t\") pod 
\"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704687 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-srv-cert\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.704746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jpxb\" (UniqueName: \"kubernetes.io/projected/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-kube-api-access-6jpxb\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:38 crc kubenswrapper[4841]: E0130 05:08:38.705007 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.204994156 +0000 UTC m=+56.198466794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.706574 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc438d29-a2a7-4774-9e46-93aa6a827129-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.706761 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-config-volume\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.706886 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.708083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-config\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: 
\"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.710155 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-default-certificate\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.710483 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a4f6ee9-be68-47f5-a898-c39acbdb2852-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.711513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-trusted-ca\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.712480 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1e3388a-70c3-4588-a357-7131eae20e2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.712872 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a4f6ee9-be68-47f5-a898-c39acbdb2852-images\") pod 
\"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.713300 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8f46a8-b567-409c-ba03-4bdb0f85259d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.713639 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-tmpfs\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.715494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7ad6da3-78c5-4541-8b41-bdcde44d577f-signing-key\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.716474 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.717352 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-stats-auth\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.717988 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-webhook-cert\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.718294 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8f46a8-b567-409c-ba03-4bdb0f85259d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.718362 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1e3388a-70c3-4588-a357-7131eae20e2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.718434 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64d09cb6-73a6-4de8-8164-d8a241df4e5c-metrics-certs\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.718713 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79ec5e12-1868-4efd-a76c-e7a06360cb3b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.719089 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-secret-volume\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.719146 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f2ade6a-33a5-4643-84ee-b2bd43c55446-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s42sv\" (UID: \"5f2ade6a-33a5-4643-84ee-b2bd43c55446\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.719870 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-certificates\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.720140 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-trusted-ca\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " 
pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.720285 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.720910 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ec5e12-1868-4efd-a76c-e7a06360cb3b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.720979 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-config\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.725691 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7ad6da3-78c5-4541-8b41-bdcde44d577f-signing-cabundle\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.726133 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-serving-cert\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.726250 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/890f082d-3030-4e2a-bf11-fc6fb3f8bf18-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p8zbg\" (UID: \"890f082d-3030-4e2a-bf11-fc6fb3f8bf18\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.727439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.728079 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-apiservice-cert\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.728416 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.728880 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64d09cb6-73a6-4de8-8164-d8a241df4e5c-service-ca-bundle\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " 
pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.729807 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a4f6ee9-be68-47f5-a898-c39acbdb2852-proxy-tls\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.730500 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.733127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc438d29-a2a7-4774-9e46-93aa6a827129-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.733834 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-tls\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.734955 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-srv-cert\") pod 
\"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.744443 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.746463 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.751825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-profile-collector-cert\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.753698 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz"] Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.754341 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tss7\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-kube-api-access-7tss7\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.757141 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5hkjb"] Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.775971 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbvl\" (UniqueName: \"kubernetes.io/projected/ea8f46a8-b567-409c-ba03-4bdb0f85259d-kube-api-access-9vbvl\") pod \"openshift-controller-manager-operator-756b6f6bc6-65j7g\" (UID: \"ea8f46a8-b567-409c-ba03-4bdb0f85259d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.780715 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck45n\" (UniqueName: \"kubernetes.io/projected/be148e1a-b698-4b18-abfa-9988c3c31971-kube-api-access-ck45n\") pod \"migrator-59844c95c7-f8frn\" (UID: \"be148e1a-b698-4b18-abfa-9988c3c31971\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.804359 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2xzc\" (UniqueName: \"kubernetes.io/projected/d80ad8d1-44b4-4322-be87-71e2b9c72a5c-kube-api-access-m2xzc\") pod \"console-operator-58897d9998-z5pll\" (UID: \"d80ad8d1-44b4-4322-be87-71e2b9c72a5c\") " pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.809143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcae0fc1-0485-491b-9003-ded9c95fe166-cert\") pod \"ingress-canary-76czq\" (UID: \"dcae0fc1-0485-491b-9003-ded9c95fe166\") " pod="openshift-ingress-canary/ingress-canary-76czq" Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.809179 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-registration-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.809180 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-z5pll"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.809208 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/afcb5d20-fdc9-4472-b201-253a90897fa5-node-bootstrap-token\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810237 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-plugins-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810260 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrjfd\" (UniqueName: \"kubernetes.io/projected/ee896b81-0955-4a1e-a9ac-20887e4612c1-kube-api-access-hrjfd\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810304 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vbz\" (UniqueName: \"kubernetes.io/projected/c95986c1-1a6e-442a-bb94-4c6778398fec-kube-api-access-92vbz\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810331 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-socket-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810439 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4vst\" (UniqueName: \"kubernetes.io/projected/dcae0fc1-0485-491b-9003-ded9c95fe166-kube-api-access-c4vst\") pod \"ingress-canary-76czq\" (UID: \"dcae0fc1-0485-491b-9003-ded9c95fe166\") " pod="openshift-ingress-canary/ingress-canary-76czq"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810578 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-mountpoint-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810656 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c95986c1-1a6e-442a-bb94-4c6778398fec-config-volume\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810745 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-csi-data-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c95986c1-1a6e-442a-bb94-4c6778398fec-metrics-tls\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.810954 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-plugins-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: E0130 05:08:38.811430 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.311390185 +0000 UTC m=+56.304862823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.811501 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-mountpoint-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.811509 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-socket-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.811606 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-csi-data-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.811652 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ee896b81-0955-4a1e-a9ac-20887e4612c1-registration-dir\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.811997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/afcb5d20-fdc9-4472-b201-253a90897fa5-certs\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.812019 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8hmh\" (UniqueName: \"kubernetes.io/projected/afcb5d20-fdc9-4472-b201-253a90897fa5-kube-api-access-l8hmh\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.815963 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcae0fc1-0485-491b-9003-ded9c95fe166-cert\") pod \"ingress-canary-76czq\" (UID: \"dcae0fc1-0485-491b-9003-ded9c95fe166\") " pod="openshift-ingress-canary/ingress-canary-76czq"
Jan 30 05:08:38 crc kubenswrapper[4841]: W0130 05:08:38.816289 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe3fe34_31a2_4843_9df6_8dd5d6c968d8.slice/crio-8fd32b2ceaa40ce2ab8660e894215fe4a654d9dfe28e2f70030a41d3abb53ab6 WatchSource:0}: Error finding container 8fd32b2ceaa40ce2ab8660e894215fe4a654d9dfe28e2f70030a41d3abb53ab6: Status 404 returned error can't find the container with id 8fd32b2ceaa40ce2ab8660e894215fe4a654d9dfe28e2f70030a41d3abb53ab6
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.818851 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c95986c1-1a6e-442a-bb94-4c6778398fec-config-volume\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.819263 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.819828 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c95986c1-1a6e-442a-bb94-4c6778398fec-metrics-tls\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.826136 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/afcb5d20-fdc9-4472-b201-253a90897fa5-certs\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.827823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/afcb5d20-fdc9-4472-b201-253a90897fa5-node-bootstrap-token\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.829683 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld"]
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.852920 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnvmw\" (UniqueName: \"kubernetes.io/projected/64d09cb6-73a6-4de8-8164-d8a241df4e5c-kube-api-access-bnvmw\") pod \"router-default-5444994796-r5skt\" (UID: \"64d09cb6-73a6-4de8-8164-d8a241df4e5c\") " pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.858022 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" event={"ID":"66edf007-1920-4e59-a256-52a9360dcf9f","Type":"ContainerStarted","Data":"d799d593288cc2022c1ba0a5f5a83ce7815223a7b0849b4d46191425f385d238"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.860807 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ffmj4"]
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.866025 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" event={"ID":"a785f654-41ed-4c03-baf7-b0fb5bc3f543","Type":"ContainerStarted","Data":"0d4b94244b9cee6d38105026731cb2e3c2579462d484dc74cc9f5930d17c3064"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.872148 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.872277 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmm95\" (UniqueName: \"kubernetes.io/projected/5b241ddd-e54d-45bb-bb4d-9001575f3cb0-kube-api-access-bmm95\") pod \"catalog-operator-68c6474976-xsjgv\" (UID: \"5b241ddd-e54d-45bb-bb4d-9001575f3cb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.881423 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7zvv6"]
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.881461 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2dp8z"]
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.881652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" event={"ID":"29892f01-d39f-41cd-aa3c-402791553b2c","Type":"ContainerStarted","Data":"e0e2d240e55aec9180d486764099d04a2b9f2b2feca8f33ead6a2fc489abd593"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.881679 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" event={"ID":"29892f01-d39f-41cd-aa3c-402791553b2c","Type":"ContainerStarted","Data":"68eda3759be985e57d1a2ab37ef00df4d41b99d19039b8a34362be3a06d91d5a"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.891667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c547h\" (UniqueName: \"kubernetes.io/projected/c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0-kube-api-access-c547h\") pod \"kube-storage-version-migrator-operator-b67b599dd-487lw\" (UID: \"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.897836 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.903227 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wm2k\" (UniqueName: \"kubernetes.io/projected/e336a49d-a5c5-4b4e-9cd9-64e55db5a845-kube-api-access-7wm2k\") pod \"packageserver-d55dfcdfc-xkjjh\" (UID: \"e336a49d-a5c5-4b4e-9cd9-64e55db5a845\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.909472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" event={"ID":"87eccd50-4e4a-408b-aa2e-3c431b6d17d0","Type":"ContainerStarted","Data":"1cb3ba3a872080b8feeee5f7a44e324674e82ae7de9f5b4629024b7bbc64af39"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.913315 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:38 crc kubenswrapper[4841]: E0130 05:08:38.914047 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.41402278 +0000 UTC m=+56.407495418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.915710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" event={"ID":"692a0681-d33c-43ff-b458-8c2302df6bd9","Type":"ContainerStarted","Data":"b9b8497d146c46314ea0cbedd7869cb4812e5be66207835f96a79d46c7fad250"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.915741 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" event={"ID":"692a0681-d33c-43ff-b458-8c2302df6bd9","Type":"ContainerStarted","Data":"0b558d4e55c562e2dc4170384c16da8d7dcb5e20dbf6c62cce89e508d41b6c72"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.931715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" event={"ID":"09769310-f1d3-49d3-87bf-1921c35b32de","Type":"ContainerStarted","Data":"ae482e17da54a8d84cfbc020d9f1ed96eca62ae132af0692635f53daaa985b15"}
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.936855 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f55f083a-8e13-4cb3-aaec-9e5bef3f6075-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4j6fb\" (UID: \"f55f083a-8e13-4cb3-aaec-9e5bef3f6075\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.942040 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5fzn\" (UniqueName: \"kubernetes.io/projected/d1e3388a-70c3-4588-a357-7131eae20e2e-kube-api-access-w5fzn\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.944876 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.965690 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw22t\" (UniqueName: \"kubernetes.io/projected/7599df20-a4f3-4f48-b413-8eec3c0bba38-kube-api-access-tw22t\") pod \"marketplace-operator-79b997595-4c6xs\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.986565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wngnh\" (UniqueName: \"kubernetes.io/projected/cc438d29-a2a7-4774-9e46-93aa6a827129-kube-api-access-wngnh\") pod \"openshift-apiserver-operator-796bbdcf4f-5dfls\" (UID: \"cc438d29-a2a7-4774-9e46-93aa6a827129\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.998446 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jpxb\" (UniqueName: \"kubernetes.io/projected/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-kube-api-access-6jpxb\") pod \"collect-profiles-29495820-dbjqh\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"
Jan 30 05:08:38 crc kubenswrapper[4841]: I0130 05:08:38.998563 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.016233 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.019434 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.519416223 +0000 UTC m=+56.512888861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.032097 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1e3388a-70c3-4588-a357-7131eae20e2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-49fvr\" (UID: \"d1e3388a-70c3-4588-a357-7131eae20e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.037996 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.040791 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.040823 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bznfv"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.048066 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mms8q"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.048118 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t9l6j"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.054608 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d9c232c-75cc-4f38-bf2f-9e2de76138ca-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9pwf9\" (UID: \"2d9c232c-75cc-4f38-bf2f-9e2de76138ca\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.066520 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdrcr\" (UniqueName: \"kubernetes.io/projected/5f2ade6a-33a5-4643-84ee-b2bd43c55446-kube-api-access-rdrcr\") pod \"control-plane-machine-set-operator-78cbb6b69f-s42sv\" (UID: \"5f2ade6a-33a5-4643-84ee-b2bd43c55446\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.069193 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.080273 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-bound-sa-token\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.100513 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.105783 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.112232 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r5skt"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.117347 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.117742 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.61772535 +0000 UTC m=+56.611197988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.119832 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sg89\" (UniqueName: \"kubernetes.io/projected/d7ad6da3-78c5-4541-8b41-bdcde44d577f-kube-api-access-8sg89\") pod \"service-ca-9c57cc56f-c7m6v\" (UID: \"d7ad6da3-78c5-4541-8b41-bdcde44d577f\") " pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v"
Jan 30 05:08:39 crc kubenswrapper[4841]: W0130 05:08:39.143753 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1228156_5459_400b_97d7_16c75238223b.slice/crio-28f639c20c1dc20b4d279fe37915def251836134badd8621b98a931b7b1e8d54 WatchSource:0}: Error finding container 28f639c20c1dc20b4d279fe37915def251836134badd8621b98a931b7b1e8d54: Status 404 returned error can't find the container with id 28f639c20c1dc20b4d279fe37915def251836134badd8621b98a931b7b1e8d54
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.151776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqqg\" (UniqueName: \"kubernetes.io/projected/890f082d-3030-4e2a-bf11-fc6fb3f8bf18-kube-api-access-bfqqg\") pod \"package-server-manager-789f6589d5-p8zbg\" (UID: \"890f082d-3030-4e2a-bf11-fc6fb3f8bf18\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.160644 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.161747 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.162294 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hz4\" (UniqueName: \"kubernetes.io/projected/3a4f6ee9-be68-47f5-a898-c39acbdb2852-kube-api-access-78hz4\") pod \"machine-config-operator-74547568cd-d9hrx\" (UID: \"3a4f6ee9-be68-47f5-a898-c39acbdb2852\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.164437 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lrbq7"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.176639 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.178788 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.185709 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.188935 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.192978 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.193278 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92vbz\" (UniqueName: \"kubernetes.io/projected/c95986c1-1a6e-442a-bb94-4c6778398fec-kube-api-access-92vbz\") pod \"dns-default-7wz6l\" (UID: \"c95986c1-1a6e-442a-bb94-4c6778398fec\") " pod="openshift-dns/dns-default-7wz6l"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.204934 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.214206 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrjfd\" (UniqueName: \"kubernetes.io/projected/ee896b81-0955-4a1e-a9ac-20887e4612c1-kube-api-access-hrjfd\") pod \"csi-hostpathplugin-7kqcm\" (UID: \"ee896b81-0955-4a1e-a9ac-20887e4612c1\") " pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.218381 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4vst\" (UniqueName: \"kubernetes.io/projected/dcae0fc1-0485-491b-9003-ded9c95fe166-kube-api-access-c4vst\") pod \"ingress-canary-76czq\" (UID: \"dcae0fc1-0485-491b-9003-ded9c95fe166\") " pod="openshift-ingress-canary/ingress-canary-76czq"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.218590 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.219139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.219538 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.719527642 +0000 UTC m=+56.713000280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.231205 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.231986 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.243522 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8hmh\" (UniqueName: \"kubernetes.io/projected/afcb5d20-fdc9-4472-b201-253a90897fa5-kube-api-access-l8hmh\") pod \"machine-config-server-pcg6z\" (UID: \"afcb5d20-fdc9-4472-b201-253a90897fa5\") " pod="openshift-machine-config-operator/machine-config-server-pcg6z"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.273978 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.291600 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.305989 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z5pll"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.321375 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.321685 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.821671635 +0000 UTC m=+56.815144263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.321761 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pcg6z"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.322093 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-76czq"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.346915 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.349494 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7wz6l"
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.364349 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.386555 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.429460 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.429733 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:39.929722876 +0000 UTC m=+56.923195504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:39 crc kubenswrapper[4841]: W0130 05:08:39.479156 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd80ad8d1_44b4_4322_be87_71e2b9c72a5c.slice/crio-14a6cefab88359ea2456fc039545379fed9094a4f4b2660b9a1447ad7d6e276e WatchSource:0}: Error finding container 14a6cefab88359ea2456fc039545379fed9094a4f4b2660b9a1447ad7d6e276e: Status 404 returned error can't find the container with id 14a6cefab88359ea2456fc039545379fed9094a4f4b2660b9a1447ad7d6e276e
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.487707 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"]
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.533711 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.534082 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.034066423 +0000 UTC m=+57.027539061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.536590 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls"]
Jan 30 05:08:39 crc kubenswrapper[4841]: W0130 05:08:39.560991 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode336a49d_a5c5_4b4e_9cd9_64e55db5a845.slice/crio-5a2c00361cc6231b6f2c151fb30184fcecbb11080359534f82886ab3466f9757 WatchSource:0}: Error finding container 5a2c00361cc6231b6f2c151fb30184fcecbb11080359534f82886ab3466f9757: Status 404 returned error can't find the container with id 5a2c00361cc6231b6f2c151fb30184fcecbb11080359534f82886ab3466f9757
Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.634956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.636154 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.136139124 +0000 UTC m=+57.129611762 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:39 crc kubenswrapper[4841]: W0130 05:08:39.640607 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc438d29_a2a7_4774_9e46_93aa6a827129.slice/crio-d84bebdb2624e03d2520f68d7d181f9befc8ddc2bd0606a748134f70747bc652 WatchSource:0}: Error finding container d84bebdb2624e03d2520f68d7d181f9befc8ddc2bd0606a748134f70747bc652: Status 404 returned error can't find the container with id d84bebdb2624e03d2520f68d7d181f9befc8ddc2bd0606a748134f70747bc652 Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.684337 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx"] Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.735889 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.736285 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.236272625 +0000 UTC m=+57.229745263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.749151 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb"] Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.774568 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr"] Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.805137 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv"] Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.837029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.837363 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.337351171 +0000 UTC m=+57.330823809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.885606 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv"] Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.921660 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw"] Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.938303 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.938559 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.438537988 +0000 UTC m=+57.432010626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.938610 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:39 crc kubenswrapper[4841]: E0130 05:08:39.938942 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.438934928 +0000 UTC m=+57.432407566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.965592 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" event={"ID":"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8","Type":"ContainerStarted","Data":"aec7788a70a630de48298989ffd5285ecd960aade19357c79f5d21ecb349db3f"} Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.965633 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" event={"ID":"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8","Type":"ContainerStarted","Data":"8fd32b2ceaa40ce2ab8660e894215fe4a654d9dfe28e2f70030a41d3abb53ab6"} Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.966655 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.980630 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" event={"ID":"9eee3621-c382-4ee6-a955-0061726a0214","Type":"ContainerStarted","Data":"6c7df20abde72612c6849d3bc8c068aeb83dbeb9c2f6c478431c2a66df022785"} Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.980684 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" 
event={"ID":"9eee3621-c382-4ee6-a955-0061726a0214","Type":"ContainerStarted","Data":"0fe499b22a9c8f8757c42b366fa3ec2c49d5242e0164c06cabac5b6387f9bd22"} Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.989148 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" event={"ID":"345f9dbf-0dbd-4d48-841f-0f9637618c3a","Type":"ContainerStarted","Data":"1cbc6ae670ec9107d65de98a1d0014e651eeb5bd76792d25a7f4626388207960"} Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.993840 4841 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5hkjb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Jan 30 05:08:39 crc kubenswrapper[4841]: I0130 05:08:39.993897 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" podUID="cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.000538 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.000515749 podStartE2EDuration="1.000515749s" podCreationTimestamp="2026-01-30 05:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:39.999002331 +0000 UTC m=+56.992474979" watchObservedRunningTime="2026-01-30 05:08:40.000515749 +0000 UTC m=+56.993988387" Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.013860 4841 generic.go:334] "Generic (PLEG): container finished" podID="09769310-f1d3-49d3-87bf-1921c35b32de" 
containerID="b7a7962546617e4b40c7d5af202d69421b068f398b03cf0594d0bbefdec2755d" exitCode=0 Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.014201 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" event={"ID":"09769310-f1d3-49d3-87bf-1921c35b32de","Type":"ContainerDied","Data":"b7a7962546617e4b40c7d5af202d69421b068f398b03cf0594d0bbefdec2755d"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.018823 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" event={"ID":"999a31cf-76fd-4c51-82df-21bcf988140d","Type":"ContainerStarted","Data":"61b15ee266a9b819667788ea411d92c6b0ea87fa44071773e757ef6f97ac8b42"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.018862 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c6xs"] Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.018880 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" event={"ID":"999a31cf-76fd-4c51-82df-21bcf988140d","Type":"ContainerStarted","Data":"a73fbe4bf52a3e07a34bbb2b00c69239a9f4d66de8879c32abf5ee13c5cc71f4"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.028510 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" event={"ID":"d1228156-5459-400b-97d7-16c75238223b","Type":"ContainerStarted","Data":"28f639c20c1dc20b4d279fe37915def251836134badd8621b98a931b7b1e8d54"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.039802 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.039960 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.539934331 +0000 UTC m=+57.533406969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.040195 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.042220 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.542207479 +0000 UTC m=+57.535680117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.061245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7zvv6" event={"ID":"68fe97c1-4b26-445b-af5b-73808e119f0b","Type":"ContainerStarted","Data":"004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.061278 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7zvv6" event={"ID":"68fe97c1-4b26-445b-af5b-73808e119f0b","Type":"ContainerStarted","Data":"fefa0a6241043bd13dbe2618ca4ec292ba1ba9cc62d292fd9a8943c7774487e0"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.081590 4841 generic.go:334] "Generic (PLEG): container finished" podID="692a0681-d33c-43ff-b458-8c2302df6bd9" containerID="b9b8497d146c46314ea0cbedd7869cb4812e5be66207835f96a79d46c7fad250" exitCode=0 Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.081644 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" event={"ID":"692a0681-d33c-43ff-b458-8c2302df6bd9","Type":"ContainerDied","Data":"b9b8497d146c46314ea0cbedd7869cb4812e5be66207835f96a79d46c7fad250"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.081957 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" event={"ID":"692a0681-d33c-43ff-b458-8c2302df6bd9","Type":"ContainerStarted","Data":"8941ae6a705da996a4c6f886e8fdaeb17b14e18f45a30aefb5d197eaccbfa693"} Jan 30 
05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.103600 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" event={"ID":"66397da5-478a-4800-93b9-012a7684f0ad","Type":"ContainerStarted","Data":"07f72911df1f6f1771e84f6b42b31dd9a921203bdd0c569f10ab3ef5bcfb7d66"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.103647 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" event={"ID":"66397da5-478a-4800-93b9-012a7684f0ad","Type":"ContainerStarted","Data":"d7779789012d6a82c24cd7b221b557bb4d04e211e1b48448c909950ce78b0874"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.104584 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.113567 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" event={"ID":"343813f5-7868-4aa9-9d23-6c3f70f6bbd8","Type":"ContainerStarted","Data":"c1ef02c93d006124da44ebaf4d95d614d15817b21dbfd7f04d2747a7b64f29de"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.126878 4841 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-bxkf4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.126931 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" podUID="66397da5-478a-4800-93b9-012a7684f0ad" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Jan 30 
05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.127726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn" event={"ID":"be148e1a-b698-4b18-abfa-9988c3c31971","Type":"ContainerStarted","Data":"ea1015ae2f338716f61f12fc982f3b79af9bbf484f458d2472c0bc5991946fc7"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.139765 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" event={"ID":"87eccd50-4e4a-408b-aa2e-3c431b6d17d0","Type":"ContainerStarted","Data":"693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.142392 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.142595 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.143330 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.643310485 +0000 UTC m=+57.636783123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.143496 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.146182 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.646164296 +0000 UTC m=+57.639636934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.146695 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-z5pll" event={"ID":"d80ad8d1-44b4-4322-be87-71e2b9c72a5c","Type":"ContainerStarted","Data":"14a6cefab88359ea2456fc039545379fed9094a4f4b2660b9a1447ad7d6e276e"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.148553 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" event={"ID":"01747236-9ab9-46b2-952a-2c065de19cf4","Type":"ContainerStarted","Data":"54f591664f320bf2c60d131cdd23b5eefd2d4d4dcc3ef8f9c242f68b48dcd6ed"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.153947 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" event={"ID":"d1e3388a-70c3-4588-a357-7131eae20e2e","Type":"ContainerStarted","Data":"41edff20780da77b4c54b249c33ff540e953fe25ed8292de43f607a01bf76aad"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.189791 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" event={"ID":"a785f654-41ed-4c03-baf7-b0fb5bc3f543","Type":"ContainerStarted","Data":"b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.190668 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.203728 4841 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9wm9f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.203781 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" podUID="a785f654-41ed-4c03-baf7-b0fb5bc3f543" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.206351 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" event={"ID":"f55f083a-8e13-4cb3-aaec-9e5bef3f6075","Type":"ContainerStarted","Data":"3ae0fb5972eccc78ce9bc8591b56d94d9ec97d5a845cb8d97fadf1f062288431"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.226589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" event={"ID":"de438831-f663-43cd-98f9-72e133534c61","Type":"ContainerStarted","Data":"606baafe37a1e8c899a8b61c1aa9beaaf3fe10753fd68ad0371d758b3e550034"} Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.244185 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.244423 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.244495 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.244551 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.246492 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.246612 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.746590606 +0000 UTC m=+57.740063244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.247771 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" event={"ID":"5a250092-4e4c-4edf-943a-23b7ffe49bab","Type":"ContainerStarted","Data":"e66cf0666a0c55c3f37be4f51bc678785135cc8814afe6d8af5e8d4aab57bfe8"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.255724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.255825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.264695 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" event={"ID":"e336a49d-a5c5-4b4e-9cd9-64e55db5a845","Type":"ContainerStarted","Data":"5a2c00361cc6231b6f2c151fb30184fcecbb11080359534f82886ab3466f9757"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.289983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r5skt" event={"ID":"64d09cb6-73a6-4de8-8164-d8a241df4e5c","Type":"ContainerStarted","Data":"9676c6a51def6ef785386b052671de722a2c8fbd804d16480f53b31f8ea1eb6c"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.298674 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.306736 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" event={"ID":"8353c559-01f4-4b08-bf29-81566a889797","Type":"ContainerStarted","Data":"1b5b7e89835f8c0fbd32a99303a82a4c7107c639f1e4dcc6bb92acc1307e292f"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.324186 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7m7fj" podStartSLOduration=33.324157678 podStartE2EDuration="33.324157678s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:40.320549818 +0000 UTC m=+57.314022456" watchObservedRunningTime="2026-01-30 05:08:40.324157678 +0000 UTC m=+57.317630306"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.333540 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg"]
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.341352 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-c7m6v"]
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.345848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" event={"ID":"29892f01-d39f-41cd-aa3c-402791553b2c","Type":"ContainerStarted","Data":"276bedeb5c35edc021c0ad8096cb6df5b76ccccc0fbbfe28eec84b919f784919"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.346535 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.346575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.346844 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.84683086 +0000 UTC m=+57.840303498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.349611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" event={"ID":"00b1b4b6-71d6-41a0-94e9-e1f137961e72","Type":"ContainerStarted","Data":"498b8e3316cb93a90d4bcf91a1d3fd847ce04f384cd80fab5b0054f78a2c08a3"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.349655 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" event={"ID":"00b1b4b6-71d6-41a0-94e9-e1f137961e72","Type":"ContainerStarted","Data":"4256d5c8e82322936ebe279415cb82001ce6636071a4754114f0cdf39a6020e0"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.351136 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" event={"ID":"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3","Type":"ContainerStarted","Data":"a49fb14cba5e82fb875450b200454c28b9128035fab61138b5984fa22f1eace1"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.353976 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" event={"ID":"3a4f6ee9-be68-47f5-a898-c39acbdb2852","Type":"ContainerStarted","Data":"394b7c40fe3e6c8df9cf8a0cf6b779aae3f4b1d20b4a0e42883d3a91b83f41f4"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.355718 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.366660 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.367853 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" event={"ID":"ea8f46a8-b567-409c-ba03-4bdb0f85259d","Type":"ContainerStarted","Data":"ef079a7ad0382acc3b5872f2cb1b3373424eb7d8285338cb9eddfa0080fafb97"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.372431 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.376361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t9l6j" event={"ID":"76152f0e-2b76-469b-a55e-f94c53fe9e4d","Type":"ContainerStarted","Data":"11d54b9395144e8d1ab0eacd6f4b4a81a3c45ed51d699678aa7f5deb46e42e13"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.377072 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t9l6j"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.378775 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.383395 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" event={"ID":"66edf007-1920-4e59-a256-52a9360dcf9f","Type":"ContainerStarted","Data":"2b14257fca0e7ff0ef6d87277f6f9b2409d554a1a49b17fb371c6c8910c4a469"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.383454 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" event={"ID":"66edf007-1920-4e59-a256-52a9360dcf9f","Type":"ContainerStarted","Data":"3f8ea53ee0adbf8bbf8c3a726fbec28ec98238318db9f15c96869c9fe08dd91e"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.393567 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-t9l6j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.393832 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t9l6j" podUID="76152f0e-2b76-469b-a55e-f94c53fe9e4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.414427 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" event={"ID":"705df608-7f08-4d29-aaf2-c39ae4f0e0cd","Type":"ContainerStarted","Data":"a4b7499dfa26d0b3af94839e2637a6d6f58f2496dc1d15423893565f032f42eb"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.424602 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-76czq"]
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.452286 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.452797 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.952772907 +0000 UTC m=+57.946245545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.457849 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.458256 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:40.958241655 +0000 UTC m=+57.951714293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.506814 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" event={"ID":"cc438d29-a2a7-4774-9e46-93aa6a827129","Type":"ContainerStarted","Data":"d84bebdb2624e03d2520f68d7d181f9befc8ddc2bd0606a748134f70747bc652"}
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.507555 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7kqcm"]
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.562244 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.563177 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.063162827 +0000 UTC m=+58.056635465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: W0130 05:08:40.640192 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee896b81_0955_4a1e_a9ac_20887e4612c1.slice/crio-bab772a1a6e36a1f2ad663f51b090d266f326d53cf70898683b123dd25296df6 WatchSource:0}: Error finding container bab772a1a6e36a1f2ad663f51b090d266f326d53cf70898683b123dd25296df6: Status 404 returned error can't find the container with id bab772a1a6e36a1f2ad663f51b090d266f326d53cf70898683b123dd25296df6
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.663800 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.664104 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.164091659 +0000 UTC m=+58.157564297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.732280 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9"]
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.766985 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.767638 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.267610405 +0000 UTC m=+58.261083033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.781601 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7wz6l"]
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.871069 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.871368 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.371357818 +0000 UTC m=+58.364830456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.890042 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zg9xj" podStartSLOduration=33.890020557 podStartE2EDuration="33.890020557s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:40.85952579 +0000 UTC m=+57.852998428" watchObservedRunningTime="2026-01-30 05:08:40.890020557 +0000 UTC m=+57.883493195"
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.930528 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" podStartSLOduration=33.930511957 podStartE2EDuration="33.930511957s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:40.889681659 +0000 UTC m=+57.883154297" watchObservedRunningTime="2026-01-30 05:08:40.930511957 +0000 UTC m=+57.923984595"
Jan 30 05:08:40 crc kubenswrapper[4841]: W0130 05:08:40.933050 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9c232c_75cc_4f38_bf2f_9e2de76138ca.slice/crio-17ed0fcbd8cd983f8be70657012fc4d8a359c41ead37d9401f85050143737893 WatchSource:0}: Error finding container 17ed0fcbd8cd983f8be70657012fc4d8a359c41ead37d9401f85050143737893: Status 404 returned error can't find the container with id 17ed0fcbd8cd983f8be70657012fc4d8a359c41ead37d9401f85050143737893
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.972413 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.972600 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.472579136 +0000 UTC m=+58.466051774 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.973002 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:40 crc kubenswrapper[4841]: E0130 05:08:40.973318 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.473303635 +0000 UTC m=+58.466776273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:40 crc kubenswrapper[4841]: I0130 05:08:40.988740 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ffmj4" podStartSLOduration=33.988723433 podStartE2EDuration="33.988723433s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:40.935168735 +0000 UTC m=+57.928641373" watchObservedRunningTime="2026-01-30 05:08:40.988723433 +0000 UTC m=+57.982196071"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.015111 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7zvv6" podStartSLOduration=34.015094498 podStartE2EDuration="34.015094498s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.014327188 +0000 UTC m=+58.007799826" watchObservedRunningTime="2026-01-30 05:08:41.015094498 +0000 UTC m=+58.008567136"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.059475 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrk5s" podStartSLOduration=34.059462234 podStartE2EDuration="34.059462234s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.052827168 +0000 UTC m=+58.046299806" watchObservedRunningTime="2026-01-30 05:08:41.059462234 +0000 UTC m=+58.052934872"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.077174 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.077469 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.577450347 +0000 UTC m=+58.570922985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:41 crc kubenswrapper[4841]: W0130 05:08:41.078938 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc95986c1_1a6e_442a_bb94_4c6778398fec.slice/crio-8f037d46e46498cd74e8844975f1da167839717df8e3c662181095ef683770a2 WatchSource:0}: Error finding container 8f037d46e46498cd74e8844975f1da167839717df8e3c662181095ef683770a2: Status 404 returned error can't find the container with id 8f037d46e46498cd74e8844975f1da167839717df8e3c662181095ef683770a2
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.128947 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" podStartSLOduration=34.128934194 podStartE2EDuration="34.128934194s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.127828556 +0000 UTC m=+58.121301194" watchObservedRunningTime="2026-01-30 05:08:41.128934194 +0000 UTC m=+58.122406832"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.181014 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.181351 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.681338284 +0000 UTC m=+58.674810922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.205259 4841 csr.go:261] certificate signing request csr-r9kgg is approved, waiting to be issued
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.242018 4841 csr.go:257] certificate signing request csr-r9kgg is issued
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.268754 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-t9l6j" podStartSLOduration=34.268734575 podStartE2EDuration="34.268734575s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.243462168 +0000 UTC m=+58.236934806" watchObservedRunningTime="2026-01-30 05:08:41.268734575 +0000 UTC m=+58.262207203"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.269354 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" podStartSLOduration=34.269332259 podStartE2EDuration="34.269332259s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.190954745 +0000 UTC m=+58.184427383" watchObservedRunningTime="2026-01-30 05:08:41.269332259 +0000 UTC m=+58.262804897"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.281458 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" podStartSLOduration=6.281381943 podStartE2EDuration="6.281381943s" podCreationTimestamp="2026-01-30 05:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.280365807 +0000 UTC m=+58.273838445" watchObservedRunningTime="2026-01-30 05:08:41.281381943 +0000 UTC m=+58.274854581"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.282343 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.283964 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.783933847 +0000 UTC m=+58.777406535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.383834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.384201 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.884186001 +0000 UTC m=+58.877658639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.492318 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.492544 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.9925163 +0000 UTC m=+58.985988938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.492931 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj"
Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.493232 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:41.993219837 +0000 UTC m=+58.986692465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.525975 4841 generic.go:334] "Generic (PLEG): container finished" podID="01747236-9ab9-46b2-952a-2c065de19cf4" containerID="cd82df955d8f31aca5371d385fbd371d368669e99317e5df4af3dbd4c6b47870" exitCode=0
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.526194 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" event={"ID":"01747236-9ab9-46b2-952a-2c065de19cf4","Type":"ContainerDied","Data":"cd82df955d8f31aca5371d385fbd371d368669e99317e5df4af3dbd4c6b47870"}
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.570867 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" event={"ID":"343813f5-7868-4aa9-9d23-6c3f70f6bbd8","Type":"ContainerStarted","Data":"79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2"}
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.576606 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.586159 4841 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-52nnq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.586206 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" podUID="343813f5-7868-4aa9-9d23-6c3f70f6bbd8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.593871 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.594746 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.094731274 +0000 UTC m=+59.088203912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.598509 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" podStartSLOduration=34.598492958 podStartE2EDuration="34.598492958s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.596954649 +0000 UTC m=+58.590427287" watchObservedRunningTime="2026-01-30 05:08:41.598492958 +0000 UTC m=+58.591965596" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.609778 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r5skt" event={"ID":"64d09cb6-73a6-4de8-8164-d8a241df4e5c","Type":"ContainerStarted","Data":"3787c9fecd00b3e2b91dd2e55136e8f46f24dfba0399f888bc16fdb2f96d535e"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.612574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t9l6j" event={"ID":"76152f0e-2b76-469b-a55e-f94c53fe9e4d","Type":"ContainerStarted","Data":"2a65d741d3c7d20f3c5d6294f93fe99e028c2b98ada96fbb42393dc4fce8a170"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.613248 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-t9l6j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: 
connection refused" start-of-body= Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.613284 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t9l6j" podUID="76152f0e-2b76-469b-a55e-f94c53fe9e4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.622552 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" event={"ID":"8353c559-01f4-4b08-bf29-81566a889797","Type":"ContainerStarted","Data":"f9824fa19014cd81a12f0612f26d26691cb5a412e39119939126374e770d5e65"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.622593 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" event={"ID":"8353c559-01f4-4b08-bf29-81566a889797","Type":"ContainerStarted","Data":"32528f200964dd444c99a3f6c309cdec8a01b8353f6fe0cfcc690ecf0cadfa3b"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.632651 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" event={"ID":"d1228156-5459-400b-97d7-16c75238223b","Type":"ContainerStarted","Data":"785c35922bfa86c5c235b433db32f05f1a549739c7c72720a21e9b7c31f3a3a1"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.634763 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" event={"ID":"cc438d29-a2a7-4774-9e46-93aa6a827129","Type":"ContainerStarted","Data":"cb8664e0a6bdcaa2c287eecccdb1422287029ab17cbf76833e3bf3daa2dc4121"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.638988 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" event={"ID":"9eee3621-c382-4ee6-a955-0061726a0214","Type":"ContainerStarted","Data":"41ba9be600fcca08a21dc59ad924ad74ca26bbbf261615a585b084a99fabca21"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.640307 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" event={"ID":"2d9c232c-75cc-4f38-bf2f-9e2de76138ca","Type":"ContainerStarted","Data":"17ed0fcbd8cd983f8be70657012fc4d8a359c41ead37d9401f85050143737893"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.670634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7wz6l" event={"ID":"c95986c1-1a6e-442a-bb94-4c6778398fec","Type":"ContainerStarted","Data":"8f037d46e46498cd74e8844975f1da167839717df8e3c662181095ef683770a2"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.672139 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" event={"ID":"705df608-7f08-4d29-aaf2-c39ae4f0e0cd","Type":"ContainerStarted","Data":"44c5fce58ba6b5265fed46e62435d71d97b647c9e13baca4ce7f2c4c81a3671f"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.677712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn" event={"ID":"be148e1a-b698-4b18-abfa-9988c3c31971","Type":"ContainerStarted","Data":"fcc9d1a63f0396ad4868b42f6ed3d68ff5513066b67842538c1fcb401e8bf4cd"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.679733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" event={"ID":"692a0681-d33c-43ff-b458-8c2302df6bd9","Type":"ContainerStarted","Data":"4672eb03f6df05a4e8076b57881ff15c93c19bd7bccaee1a8494e1ae967585ba"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.681435 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" event={"ID":"f55f083a-8e13-4cb3-aaec-9e5bef3f6075","Type":"ContainerStarted","Data":"7087208e0b37eabd97ab356dab7a5b0f37e48f63f7dd07b3bff3d014dc64a810"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.682835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" event={"ID":"00b1b4b6-71d6-41a0-94e9-e1f137961e72","Type":"ContainerStarted","Data":"fa047ec21f5af3732cf6bba164ae78e2d7a8621cb5d911f94175629d4831003b"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.688384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" event={"ID":"5f2ade6a-33a5-4643-84ee-b2bd43c55446","Type":"ContainerStarted","Data":"b5da78ebcfc85ba9556bd2d0aeca2b82aab03fe1511b18e9c97105e55e4c6194"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.688427 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" event={"ID":"5f2ade6a-33a5-4643-84ee-b2bd43c55446","Type":"ContainerStarted","Data":"c859063e5b45d926add5f9c277526e485c05a5973b6f5eb55ceea30e87f3bbfb"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.689748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" event={"ID":"3a4f6ee9-be68-47f5-a898-c39acbdb2852","Type":"ContainerStarted","Data":"11a72495cc8918b6a54535de7d210a9084375dae0a8842afb4cfff15d3146fca"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.690479 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" event={"ID":"ee896b81-0955-4a1e-a9ac-20887e4612c1","Type":"ContainerStarted","Data":"bab772a1a6e36a1f2ad663f51b090d266f326d53cf70898683b123dd25296df6"} Jan 30 05:08:41 crc 
kubenswrapper[4841]: I0130 05:08:41.691383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" event={"ID":"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0","Type":"ContainerStarted","Data":"babfbda2ae07d708ac78c41bd44c173c718ac619b897c2759e65bba188a24d5d"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.691415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" event={"ID":"c8d4c8c2-30e1-4ffc-b4c2-8ce641c14cc0","Type":"ContainerStarted","Data":"16e06a349333fa263d88bb146b8657bd59099b810e0fda458f98665aede579bb"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.692729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" event={"ID":"ea8f46a8-b567-409c-ba03-4bdb0f85259d","Type":"ContainerStarted","Data":"2d59c0a22e081579d9eee501c2de6eb1756bb6246513e3a4692d171f44b1c4d6"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.694630 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" event={"ID":"de438831-f663-43cd-98f9-72e133534c61","Type":"ContainerStarted","Data":"b1425a5eebaa49f72441c72d907369d7406e3877206e1a15403be63073c0f11b"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.695723 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.698658 4841 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.19864574 +0000 UTC m=+59.192118378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.707256 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" event={"ID":"5a250092-4e4c-4edf-943a-23b7ffe49bab","Type":"ContainerStarted","Data":"56b2120089b7fde6f765d91a5e725e915fd33ca1753dc6859958f5a5b8195e14"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.708375 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" event={"ID":"890f082d-3030-4e2a-bf11-fc6fb3f8bf18","Type":"ContainerStarted","Data":"138db5ad99105cffff2564fd328ad9109884d41149b427022fd0032e6a20c353"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.709956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" event={"ID":"5b241ddd-e54d-45bb-bb4d-9001575f3cb0","Type":"ContainerStarted","Data":"0a056b288293f5b07874ec0061261b0a03197955f43b279e5a95383ea19af857"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.709988 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" 
event={"ID":"5b241ddd-e54d-45bb-bb4d-9001575f3cb0","Type":"ContainerStarted","Data":"a8a28a4584f6739e8725c5cd52a0890b043e00a2743ff8280bc499846a55dd66"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.710371 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.712619 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" event={"ID":"f36c8b3b-03b1-41fb-8649-660e3cdb1bf3","Type":"ContainerStarted","Data":"f9a9060fc235de4c471b370e063b89590bd8719b38d2fdfb8e0ae864d671d22d"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.716521 4841 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xsjgv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.716553 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" podUID="5b241ddd-e54d-45bb-bb4d-9001575f3cb0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.717229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" event={"ID":"d7ad6da3-78c5-4541-8b41-bdcde44d577f","Type":"ContainerStarted","Data":"eb03365dc14a40bedd406fb994e1c2702d70c0c0788f50e7d9deb643c3054df2"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.718532 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" event={"ID":"7599df20-a4f3-4f48-b413-8eec3c0bba38","Type":"ContainerStarted","Data":"f9d3180e69bd282fe3b847b6b43b625085b6ee927a0eab391a1483b0339c07da"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.720609 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pcg6z" event={"ID":"afcb5d20-fdc9-4472-b201-253a90897fa5","Type":"ContainerStarted","Data":"5d6c48490a89ba78922d9871a2fdbc5171e340142923ba71ecabe2b97c23dfe7"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.720656 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pcg6z" event={"ID":"afcb5d20-fdc9-4472-b201-253a90897fa5","Type":"ContainerStarted","Data":"89d1bd829df18c73826ec46c9dfcf6fd8fb8d5748344b362668773c09d8d4b5a"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.722083 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-76czq" event={"ID":"dcae0fc1-0485-491b-9003-ded9c95fe166","Type":"ContainerStarted","Data":"ad8b8f136a9aa6bc39f8b319e2641fc8a27b7a7550dbcc7dae326dab88f37c8d"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.723354 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" event={"ID":"e336a49d-a5c5-4b4e-9cd9-64e55db5a845","Type":"ContainerStarted","Data":"974fd47ceeec919a927202c9ba5a73f4e0260aae9f7a655c10887c20862793bf"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.724166 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.725332 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" 
event={"ID":"d1e3388a-70c3-4588-a357-7131eae20e2e","Type":"ContainerStarted","Data":"5fdde992e48721400a7cd7e70ccaa4946b01e2b43a5825effe4e446f9808a8c9"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.725527 4841 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xkjjh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.725556 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" podUID="e336a49d-a5c5-4b4e-9cd9-64e55db5a845" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.730264 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-z5pll" event={"ID":"d80ad8d1-44b4-4322-be87-71e2b9c72a5c","Type":"ContainerStarted","Data":"f6b81b4b8524a7854b5952cb49e3778c59e9962e9703b758ed9d3f4c88624082"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.731214 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.745834 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" event={"ID":"345f9dbf-0dbd-4d48-841f-0f9637618c3a","Type":"ContainerStarted","Data":"8736654efdb588448a003ce665d9459969ea8d0f464ba46e4667e7b9c6a3bfd4"} Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.767520 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r5skt" 
podStartSLOduration=34.767506424 podStartE2EDuration="34.767506424s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.765006471 +0000 UTC m=+58.758479109" watchObservedRunningTime="2026-01-30 05:08:41.767506424 +0000 UTC m=+58.760979062" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.784533 4841 patch_prober.go:28] interesting pod/console-operator-58897d9998-z5pll container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.784589 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-z5pll" podUID="d80ad8d1-44b4-4322-be87-71e2b9c72a5c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.784881 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.785322 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bxkf4" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.793245 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.803211 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.803520 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-chrvd" podStartSLOduration=34.803506091 podStartE2EDuration="34.803506091s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.802978178 +0000 UTC m=+58.796450816" watchObservedRunningTime="2026-01-30 05:08:41.803506091 +0000 UTC m=+58.796978719" Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.804255 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.304241839 +0000 UTC m=+59.297714477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.829566 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" podStartSLOduration=34.829549457 podStartE2EDuration="34.829549457s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.829063954 +0000 UTC m=+58.822536592" watchObservedRunningTime="2026-01-30 05:08:41.829549457 +0000 UTC m=+58.823022095" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.898552 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2dp8z" podStartSLOduration=34.898521664 podStartE2EDuration="34.898521664s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.888103711 +0000 UTC m=+58.881576349" watchObservedRunningTime="2026-01-30 05:08:41.898521664 +0000 UTC m=+58.891994302" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.944487 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: 
\"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:41 crc kubenswrapper[4841]: E0130 05:08:41.945000 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.444990454 +0000 UTC m=+59.438463092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.976535 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qqfld" podStartSLOduration=34.976516827 podStartE2EDuration="34.976516827s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:41.943765893 +0000 UTC m=+58.937238531" watchObservedRunningTime="2026-01-30 05:08:41.976516827 +0000 UTC m=+58.969989465" Jan 30 05:08:41 crc kubenswrapper[4841]: I0130 05:08:41.977102 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s42sv" podStartSLOduration=34.977097742 podStartE2EDuration="34.977097742s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-30 05:08:41.975963423 +0000 UTC m=+58.969436061" watchObservedRunningTime="2026-01-30 05:08:41.977097742 +0000 UTC m=+58.970570380" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.006626 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-65j7g" podStartSLOduration=35.006612295 podStartE2EDuration="35.006612295s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.004224885 +0000 UTC m=+58.997697533" watchObservedRunningTime="2026-01-30 05:08:42.006612295 +0000 UTC m=+59.000084933" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.050193 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.050481 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.550391587 +0000 UTC m=+59.543864225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.116667 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.116930 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.116682 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.130643 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-487lw" podStartSLOduration=35.130626919 podStartE2EDuration="35.130626919s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.087320578 +0000 UTC m=+59.080793216" watchObservedRunningTime="2026-01-30 05:08:42.130626919 +0000 UTC m=+59.124099557" Jan 30 05:08:42 crc 
kubenswrapper[4841]: I0130 05:08:42.131702 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4j6fb" podStartSLOduration=35.131696625000004 podStartE2EDuration="35.131696625s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.128876214 +0000 UTC m=+59.122348852" watchObservedRunningTime="2026-01-30 05:08:42.131696625 +0000 UTC m=+59.125169263" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.153422 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wh9ns"] Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.159001 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" podStartSLOduration=35.158979732 podStartE2EDuration="35.158979732s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.153266118 +0000 UTC m=+59.146738756" watchObservedRunningTime="2026-01-30 05:08:42.158979732 +0000 UTC m=+59.152452390" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.160796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.161205 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.661194007 +0000 UTC m=+59.654666645 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.192202 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pcg6z" podStartSLOduration=7.192183259 podStartE2EDuration="7.192183259s" podCreationTimestamp="2026-01-30 05:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.185774317 +0000 UTC m=+59.179246955" watchObservedRunningTime="2026-01-30 05:08:42.192183259 +0000 UTC m=+59.185655897" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.239459 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4n5bd" podStartSLOduration=35.239444298 podStartE2EDuration="35.239444298s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.23835744 +0000 UTC m=+59.231830078" watchObservedRunningTime="2026-01-30 05:08:42.239444298 +0000 UTC m=+59.232916936" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.248588 4841 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-30 05:03:41 +0000 UTC, rotation deadline is 2026-11-14 17:40:12.74960319 +0000 UTC Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.248635 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6924h31m30.5009699s for next certificate rotation Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.265890 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.265989 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.765973146 +0000 UTC m=+59.759445784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.266238 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.266532 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.766525241 +0000 UTC m=+59.759997879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.297472 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bznfv" podStartSLOduration=35.297448599 podStartE2EDuration="35.297448599s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.295819978 +0000 UTC m=+59.289292616" watchObservedRunningTime="2026-01-30 05:08:42.297448599 +0000 UTC m=+59.290921237" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.320575 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-z5pll" podStartSLOduration=35.320559911 podStartE2EDuration="35.320559911s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.319562895 +0000 UTC m=+59.313035533" watchObservedRunningTime="2026-01-30 05:08:42.320559911 +0000 UTC m=+59.314032549" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.367022 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.367438 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.867419721 +0000 UTC m=+59.860892359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.367513 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" podStartSLOduration=35.367501583 podStartE2EDuration="35.367501583s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.365557814 +0000 UTC m=+59.359030452" watchObservedRunningTime="2026-01-30 05:08:42.367501583 +0000 UTC m=+59.360974221" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.438170 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mms8q" podStartSLOduration=35.438155432 podStartE2EDuration="35.438155432s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.433501795 +0000 UTC 
m=+59.426974423" watchObservedRunningTime="2026-01-30 05:08:42.438155432 +0000 UTC m=+59.431628070" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.440150 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5dfls" podStartSLOduration=35.440140402 podStartE2EDuration="35.440140402s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.406569717 +0000 UTC m=+59.400042355" watchObservedRunningTime="2026-01-30 05:08:42.440140402 +0000 UTC m=+59.433613040" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.482109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.482429 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:42.982417257 +0000 UTC m=+59.975889895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.561163 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" podStartSLOduration=35.561147919 podStartE2EDuration="35.561147919s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.513627202 +0000 UTC m=+59.507099850" watchObservedRunningTime="2026-01-30 05:08:42.561147919 +0000 UTC m=+59.554620557" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.586918 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.587194 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.087179825 +0000 UTC m=+60.080652453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.602088 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-slrnt" podStartSLOduration=35.602072129 podStartE2EDuration="35.602072129s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.561684332 +0000 UTC m=+59.555156970" watchObservedRunningTime="2026-01-30 05:08:42.602072129 +0000 UTC m=+59.595544767" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.697523 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.697822 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.197809561 +0000 UTC m=+60.191282199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.765152 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" event={"ID":"890f082d-3030-4e2a-bf11-fc6fb3f8bf18","Type":"ContainerStarted","Data":"3717d9848a28df2166f7bc1e0a28582c56902ffce0d5580e173c2b44a5852955"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.765437 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" event={"ID":"890f082d-3030-4e2a-bf11-fc6fb3f8bf18","Type":"ContainerStarted","Data":"6008102ca730af47a0872839f00cf5952acab940c02454eed085099edc065e24"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.766195 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.798313 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.799169 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.299149862 +0000 UTC m=+60.292622510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.814723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" event={"ID":"3a4f6ee9-be68-47f5-a898-c39acbdb2852","Type":"ContainerStarted","Data":"08108393726e32f7687a6622940a915e400738abf36e09c56290607c54a743a7"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.820045 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" event={"ID":"01747236-9ab9-46b2-952a-2c065de19cf4","Type":"ContainerStarted","Data":"8fea6267a4e2c39630103d242b7f55ca3de3f0721ea2a3bfa654a7e9222c0748"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.820879 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.821772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7wz6l" event={"ID":"c95986c1-1a6e-442a-bb94-4c6778398fec","Type":"ContainerStarted","Data":"e82711804b7442a85405b912cf39592821f95402bf2bb663bc28f2027053cb72"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.828595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" event={"ID":"5a250092-4e4c-4edf-943a-23b7ffe49bab","Type":"ContainerStarted","Data":"368f8b33784157c3e17bf04ba36dd50a23d52901003fc993f1826887784540c4"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.831094 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" event={"ID":"09769310-f1d3-49d3-87bf-1921c35b32de","Type":"ContainerStarted","Data":"e4824d020849f02bd5a31cf54dbaf4195be4ae455d064110fc8adf6ce35ca0bf"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.832505 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" event={"ID":"7599df20-a4f3-4f48-b413-8eec3c0bba38","Type":"ContainerStarted","Data":"852c3759a609a71af3531896af0f8229e0870854040facca8847ea946c31339e"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.833194 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.833788 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"aad19b0ad76d5bc7d029a91255a870436e7abe8468d816c6db10fb93193feb32"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.833806 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6455c2feaadb3cb4d7db63b9edd3db225f8ba1f13d61289c02efe741b8e5ab91"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.835342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f0d05221d4fe69858c477e1109e15fb87a51b4c9dc3cd10be4b03663a5e6cd9f"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.835366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b50827206bf89af1e886eea41b9e7b40e17df3767947cd5db4cf49158979f803"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.836584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn" event={"ID":"be148e1a-b698-4b18-abfa-9988c3c31971","Type":"ContainerStarted","Data":"409e7ae071d1f371e237689f3cac9dda802e5d8fff8ed36966f5020c96306699"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.837667 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" event={"ID":"d7ad6da3-78c5-4541-8b41-bdcde44d577f","Type":"ContainerStarted","Data":"39e103f0ad2eb9bb9f005f026366164fed374f05c4a435dea009a506111715bd"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.838714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"671866a14fde6e6524f93ab2963fe8bba85c8bc8d15cf250611a98a890bb325e"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.838732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3af9cf4144800bdaee0e11d49c5a175450e1320b0498420717b71c44fc64f7f0"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.839003 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.857899 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" event={"ID":"2d9c232c-75cc-4f38-bf2f-9e2de76138ca","Type":"ContainerStarted","Data":"27a118f12f98b2abe6da2a5d022f1e8987d1d2aed7df82ce354581c43c46a40c"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.858582 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" podStartSLOduration=35.858567318 podStartE2EDuration="35.858567318s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.834378269 +0000 UTC m=+59.827850907" watchObservedRunningTime="2026-01-30 05:08:42.858567318 +0000 UTC m=+59.852039956" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.859652 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4c6xs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.859695 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.876088 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" 
event={"ID":"d1e3388a-70c3-4588-a357-7131eae20e2e","Type":"ContainerStarted","Data":"56404dfdba6d75d08679254baae45e3e75aefd1a11956e35461a01f7a9332f88"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.889308 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-76czq" event={"ID":"dcae0fc1-0485-491b-9003-ded9c95fe166","Type":"ContainerStarted","Data":"09016a6b7c4c6a50ad40b218bf8c5e5c22be8fbc5855ae713248bb699183d5bf"} Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.890525 4841 patch_prober.go:28] interesting pod/console-operator-58897d9998-z5pll container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.890558 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-z5pll" podUID="d80ad8d1-44b4-4322-be87-71e2b9c72a5c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.890600 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-t9l6j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.890614 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t9l6j" podUID="76152f0e-2b76-469b-a55e-f94c53fe9e4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.905670 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:42 crc kubenswrapper[4841]: E0130 05:08:42.906044 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.406029823 +0000 UTC m=+60.399502461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.921891 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xsjgv" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.922760 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.923100 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lrbq7" podStartSLOduration=35.923090374 podStartE2EDuration="35.923090374s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.923026761 +0000 UTC m=+59.916499399" watchObservedRunningTime="2026-01-30 05:08:42.923090374 +0000 UTC m=+59.916563012" Jan 30 05:08:42 crc kubenswrapper[4841]: I0130 05:08:42.966379 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" podStartSLOduration=35.966362063 podStartE2EDuration="35.966362063s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:42.964519897 +0000 UTC m=+59.957992525" watchObservedRunningTime="2026-01-30 05:08:42.966362063 +0000 UTC m=+59.959834701" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.013485 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.016504 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.516483285 +0000 UTC m=+60.509955923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.027861 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9hrx" podStartSLOduration=36.027845591 podStartE2EDuration="36.027845591s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.027079002 +0000 UTC m=+60.020551640" watchObservedRunningTime="2026-01-30 05:08:43.027845591 +0000 UTC m=+60.021318229" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.043623 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.046559 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.082858 4841 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-n2xzz container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.15:8443/livez\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.082919 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" 
podUID="09769310-f1d3-49d3-87bf-1921c35b32de" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.15:8443/livez\": dial tcp 10.217.0.15:8443: connect: connection refused" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.085888 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.086113 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.118086 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.118422 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.618411862 +0000 UTC m=+60.611884500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.122307 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:08:43 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Jan 30 05:08:43 crc kubenswrapper[4841]: [+]process-running ok Jan 30 05:08:43 crc kubenswrapper[4841]: healthz check failed Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.122354 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.149892 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" podStartSLOduration=36.149877005 podStartE2EDuration="36.149877005s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.085794631 +0000 UTC m=+60.079267269" watchObservedRunningTime="2026-01-30 05:08:43.149877005 +0000 UTC m=+60.143349643" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.150239 4841 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" podStartSLOduration=36.150236343 podStartE2EDuration="36.150236343s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.149124285 +0000 UTC m=+60.142596923" watchObservedRunningTime="2026-01-30 05:08:43.150236343 +0000 UTC m=+60.143708971" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.225951 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.226517 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.726500703 +0000 UTC m=+60.719973341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.311170 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49fvr" podStartSLOduration=36.311152336 podStartE2EDuration="36.311152336s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.285173681 +0000 UTC m=+60.278646319" watchObservedRunningTime="2026-01-30 05:08:43.311152336 +0000 UTC m=+60.304624974" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.332249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.332609 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.832596615 +0000 UTC m=+60.826069253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.356681 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9pwf9" podStartSLOduration=36.356667301 podStartE2EDuration="36.356667301s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.312621823 +0000 UTC m=+60.306094461" watchObservedRunningTime="2026-01-30 05:08:43.356667301 +0000 UTC m=+60.350139939" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.357099 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-f8frn" podStartSLOduration=36.357095102 podStartE2EDuration="36.357095102s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.354538688 +0000 UTC m=+60.348011326" watchObservedRunningTime="2026-01-30 05:08:43.357095102 +0000 UTC m=+60.350567740" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.414477 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-c7m6v" podStartSLOduration=36.414457776 podStartE2EDuration="36.414457776s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.409939383 +0000 UTC m=+60.403412021" watchObservedRunningTime="2026-01-30 05:08:43.414457776 +0000 UTC m=+60.407930404" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.433737 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.434020 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.933997098 +0000 UTC m=+60.927469736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.434318 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.434633 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:43.934622505 +0000 UTC m=+60.928095143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.437388 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-76czq" podStartSLOduration=8.437374304 podStartE2EDuration="8.437374304s" podCreationTimestamp="2026-01-30 05:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:43.435153998 +0000 UTC m=+60.428626636" watchObservedRunningTime="2026-01-30 05:08:43.437374304 +0000 UTC m=+60.430846942" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.536847 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.536921 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.03690498 +0000 UTC m=+61.030377618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.537098 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.537438 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.037430593 +0000 UTC m=+61.030903231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.637650 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.637840 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.137813521 +0000 UTC m=+61.131286149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.739328 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.739628 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.239617025 +0000 UTC m=+61.233089663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.840029 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.840211 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.340177787 +0000 UTC m=+61.333650425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.840338 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.840610 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.340598498 +0000 UTC m=+61.334071136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.855853 4841 patch_prober.go:28] interesting pod/apiserver-76f77b778f-r9f92 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]log ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]etcd ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]poststarthook/generic-apiserver-start-informers ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]poststarthook/max-in-flight-filter ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 30 05:08:43 crc kubenswrapper[4841]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 30 05:08:43 crc kubenswrapper[4841]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 30 05:08:43 crc kubenswrapper[4841]: [+]poststarthook/project.openshift.io-projectcache ok Jan 30 05:08:43 crc kubenswrapper[4841]: [-]poststarthook/project.openshift.io-projectauthorizationcache failed: reason withheld Jan 30 05:08:43 crc kubenswrapper[4841]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Jan 30 05:08:43 crc kubenswrapper[4841]: 
[+]poststarthook/openshift.io-restmapperupdater ok Jan 30 05:08:43 crc kubenswrapper[4841]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 30 05:08:43 crc kubenswrapper[4841]: livez check failed Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.855906 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" podUID="692a0681-d33c-43ff-b458-8c2302df6bd9" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.858876 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkjjh" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.894462 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" event={"ID":"ee896b81-0955-4a1e-a9ac-20887e4612c1","Type":"ContainerStarted","Data":"454e2577ef90cd467b0f349d89058e5abc08d0a2d7eafb9ac40b76055bfc40af"} Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.896298 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7wz6l" event={"ID":"c95986c1-1a6e-442a-bb94-4c6778398fec","Type":"ContainerStarted","Data":"2c8fca524f4aa52f3ff218a510574f248c554f2e2cd01426afc316a9baa4047a"} Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.897370 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4c6xs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.898109 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.898619 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" gracePeriod=30 Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.941641 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.941832 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.441805286 +0000 UTC m=+61.435277924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:43 crc kubenswrapper[4841]: I0130 05:08:43.941971 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:43 crc kubenswrapper[4841]: E0130 05:08:43.945376 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.445363316 +0000 UTC m=+61.438835954 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.044851 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.045130 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.545103567 +0000 UTC m=+61.538576205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.045350 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.045644 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.54563318 +0000 UTC m=+61.539105818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.082169 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7wz6l" podStartSLOduration=9.08215015 podStartE2EDuration="9.08215015s" podCreationTimestamp="2026-01-30 05:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:44.081568506 +0000 UTC m=+61.075041144" watchObservedRunningTime="2026-01-30 05:08:44.08215015 +0000 UTC m=+61.075622788" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.116534 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:08:44 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Jan 30 05:08:44 crc kubenswrapper[4841]: [+]process-running ok Jan 30 05:08:44 crc kubenswrapper[4841]: healthz check failed Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.116595 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.146681 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.146863 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.646834369 +0000 UTC m=+61.640307007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.146942 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.147247 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.647232969 +0000 UTC m=+61.640705607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.247538 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.247720 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.747696189 +0000 UTC m=+61.741168827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.247854 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.264165 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.764146313 +0000 UTC m=+61.757618951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.353806 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.359780 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.854505558 +0000 UTC m=+61.847978196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.454841 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.455308 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:44.955276796 +0000 UTC m=+61.948749434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.559924 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.560125 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.060098656 +0000 UTC m=+62.053571294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.560233 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.560545 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.060531546 +0000 UTC m=+62.054004184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.571965 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-z5pll" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.612046 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t88dq"] Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.613029 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.615334 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.685774 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.685956 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-catalog-content\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " 
pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.686002 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-utilities\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.686047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6r9\" (UniqueName: \"kubernetes.io/projected/12231fcc-9527-405e-bac6-734865031f83-kube-api-access-st6r9\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.686141 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.186126229 +0000 UTC m=+62.179598867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.698468 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t88dq"] Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.788231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-utilities\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.788850 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.788949 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6r9\" (UniqueName: \"kubernetes.io/projected/12231fcc-9527-405e-bac6-734865031f83-kube-api-access-st6r9\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.789003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-utilities\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.788739 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g8lz7"] Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.789146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-catalog-content\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.789523 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-catalog-content\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.789812 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.28980007 +0000 UTC m=+62.283272698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.790175 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.795843 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.808150 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8lz7"] Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.835848 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6r9\" (UniqueName: \"kubernetes.io/projected/12231fcc-9527-405e-bac6-734865031f83-kube-api-access-st6r9\") pod \"certified-operators-t88dq\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.889673 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.889791 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-utilities\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.889813 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxhr\" (UniqueName: \"kubernetes.io/projected/4a117bdc-99af-4fd8-a810-b2d08e174f77-kube-api-access-btxhr\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.889858 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-catalog-content\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.889938 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.38992316 +0000 UTC m=+62.383395788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.922146 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" event={"ID":"ee896b81-0955-4a1e-a9ac-20887e4612c1","Type":"ContainerStarted","Data":"501dc0d9335a7125b7a385f857f142eb2ffdb07e70194b3f773ca655e9a79a2c"} Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.924870 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7wz6l" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.925000 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4c6xs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.925053 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.933767 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.988915 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5rhzf"] Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.989868 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.993763 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.993858 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-utilities\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.993907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-catalog-content\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.993931 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btxhr\" (UniqueName: \"kubernetes.io/projected/4a117bdc-99af-4fd8-a810-b2d08e174f77-kube-api-access-btxhr\") pod \"community-operators-g8lz7\" 
(UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.994089 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjx45\" (UniqueName: \"kubernetes.io/projected/0e336685-24da-4b58-b586-c3f673d2c226-kube-api-access-vjx45\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.994162 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-catalog-content\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.994198 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-utilities\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:44 crc kubenswrapper[4841]: E0130 05:08:44.995761 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.495745296 +0000 UTC m=+62.489217934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.996204 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-catalog-content\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:44 crc kubenswrapper[4841]: I0130 05:08:44.997269 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-utilities\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.000637 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rhzf"] Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.029435 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btxhr\" (UniqueName: \"kubernetes.io/projected/4a117bdc-99af-4fd8-a810-b2d08e174f77-kube-api-access-btxhr\") pod \"community-operators-g8lz7\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.095147 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.095359 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-catalog-content\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.095442 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjx45\" (UniqueName: \"kubernetes.io/projected/0e336685-24da-4b58-b586-c3f673d2c226-kube-api-access-vjx45\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.095470 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-utilities\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:45 crc kubenswrapper[4841]: E0130 05:08:45.095914 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.595885937 +0000 UTC m=+62.589358575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.096277 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-catalog-content\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.096663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-utilities\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.111157 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjx45\" (UniqueName: \"kubernetes.io/projected/0e336685-24da-4b58-b586-c3f673d2c226-kube-api-access-vjx45\") pod \"certified-operators-5rhzf\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") " pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.121303 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:08:45 crc kubenswrapper[4841]: [-]has-synced failed: reason 
withheld Jan 30 05:08:45 crc kubenswrapper[4841]: [+]process-running ok Jan 30 05:08:45 crc kubenswrapper[4841]: healthz check failed Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.121368 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.125661 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.138837 4841 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.189600 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nhvlf"] Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.190802 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.196534 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:45 crc kubenswrapper[4841]: E0130 05:08:45.196839 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 05:08:45.696829109 +0000 UTC m=+62.690301747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.248452 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhvlf"] Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.309023 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.309180 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-utilities\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.309243 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-catalog-content\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: 
I0130 05:08:45.309282 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg2tp\" (UniqueName: \"kubernetes.io/projected/f7b9e216-aaea-4222-ab65-efadd17f2f46-kube-api-access-wg2tp\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: E0130 05:08:45.309438 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.809411044 +0000 UTC m=+62.802883682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.326829 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5rhzf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.410630 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-catalog-content\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.410825 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg2tp\" (UniqueName: \"kubernetes.io/projected/f7b9e216-aaea-4222-ab65-efadd17f2f46-kube-api-access-wg2tp\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.410850 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.410886 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-utilities\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.412667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-utilities\") pod \"community-operators-nhvlf\" 
(UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.412882 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-catalog-content\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: E0130 05:08:45.413377 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:45.913367552 +0000 UTC m=+62.906840190 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.451583 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg2tp\" (UniqueName: \"kubernetes.io/projected/f7b9e216-aaea-4222-ab65-efadd17f2f46-kube-api-access-wg2tp\") pod \"community-operators-nhvlf\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.513042 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:45 crc kubenswrapper[4841]: E0130 05:08:45.513366 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 05:08:46.01335199 +0000 UTC m=+63.006824628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.521821 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t88dq"] Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.540652 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:08:45 crc kubenswrapper[4841]: W0130 05:08:45.543374 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12231fcc_9527_405e_bac6_734865031f83.slice/crio-d9cd64144ba1fc5848c24bfa84a57f177fc74339ffc06f4302f221f591f87eab WatchSource:0}: Error finding container d9cd64144ba1fc5848c24bfa84a57f177fc74339ffc06f4302f221f591f87eab: Status 404 returned error can't find the container with id d9cd64144ba1fc5848c24bfa84a57f177fc74339ffc06f4302f221f591f87eab Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.611381 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8lz7"] Jan 30 05:08:45 crc kubenswrapper[4841]: W0130 05:08:45.646744 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a117bdc_99af_4fd8_a810_b2d08e174f77.slice/crio-8120d6002c95afbd79abe10ef0c5de7d47d6558d485a4607c2819131d152793a WatchSource:0}: Error finding container 8120d6002c95afbd79abe10ef0c5de7d47d6558d485a4607c2819131d152793a: Status 404 returned error can't find the container with id 8120d6002c95afbd79abe10ef0c5de7d47d6558d485a4607c2819131d152793a Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.653849 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:45 crc kubenswrapper[4841]: E0130 05:08:45.654450 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 05:08:46.154428612 +0000 UTC m=+63.147901270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2f7jj" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.684481 4841 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-30T05:08:45.138860389Z","Handler":null,"Name":""} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.697700 4841 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.697743 4841 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.755305 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.759801 4841 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.845880 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhvlf"] Jan 30 05:08:45 crc kubenswrapper[4841]: W0130 05:08:45.855215 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b9e216_aaea_4222_ab65_efadd17f2f46.slice/crio-d1a377937c8cfb61ed291dbb846284c315e9ce2495e9cfecf096c993fdc27ec3 WatchSource:0}: Error finding container d1a377937c8cfb61ed291dbb846284c315e9ce2495e9cfecf096c993fdc27ec3: Status 404 returned error can't find the container with id d1a377937c8cfb61ed291dbb846284c315e9ce2495e9cfecf096c993fdc27ec3 Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.856887 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.859420 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.859461 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.888961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2f7jj\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.924926 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rhzf"] Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.930601 4841 generic.go:334] "Generic (PLEG): container finished" podID="12231fcc-9527-405e-bac6-734865031f83" containerID="de97f35d7278621768d6fd3f5a32b75aef6db03f5a3f2071ab9995bbfbc2d67d" exitCode=0 Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.930643 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t88dq" event={"ID":"12231fcc-9527-405e-bac6-734865031f83","Type":"ContainerDied","Data":"de97f35d7278621768d6fd3f5a32b75aef6db03f5a3f2071ab9995bbfbc2d67d"} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.930661 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t88dq" 
event={"ID":"12231fcc-9527-405e-bac6-734865031f83","Type":"ContainerStarted","Data":"d9cd64144ba1fc5848c24bfa84a57f177fc74339ffc06f4302f221f591f87eab"} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.942018 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.951520 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" event={"ID":"ee896b81-0955-4a1e-a9ac-20887e4612c1","Type":"ContainerStarted","Data":"f2b4a525d3cf7ee1d3372e24c8b2aecbf7c0f0c7614e40a18bcd107222969e2c"} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.953712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" event={"ID":"ee896b81-0955-4a1e-a9ac-20887e4612c1","Type":"ContainerStarted","Data":"c4411d845edef12392cf5416871a834fc96efca94d121a900464c06a33e87196"} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.953747 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhvlf" event={"ID":"f7b9e216-aaea-4222-ab65-efadd17f2f46","Type":"ContainerStarted","Data":"d1a377937c8cfb61ed291dbb846284c315e9ce2495e9cfecf096c993fdc27ec3"} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.957890 4841 generic.go:334] "Generic (PLEG): container finished" podID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerID="a3e8e7f692a58c41d7bfddcfd37abf5b0f769a8b168cbbc22bf45c0258a054cb" exitCode=0 Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.958409 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8lz7" event={"ID":"4a117bdc-99af-4fd8-a810-b2d08e174f77","Type":"ContainerDied","Data":"a3e8e7f692a58c41d7bfddcfd37abf5b0f769a8b168cbbc22bf45c0258a054cb"} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.958433 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-g8lz7" event={"ID":"4a117bdc-99af-4fd8-a810-b2d08e174f77","Type":"ContainerStarted","Data":"8120d6002c95afbd79abe10ef0c5de7d47d6558d485a4607c2819131d152793a"} Jan 30 05:08:45 crc kubenswrapper[4841]: I0130 05:08:45.968700 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7kqcm" podStartSLOduration=10.968684526 podStartE2EDuration="10.968684526s" podCreationTimestamp="2026-01-30 05:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:45.967157067 +0000 UTC m=+62.960629705" watchObservedRunningTime="2026-01-30 05:08:45.968684526 +0000 UTC m=+62.962157164" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.094566 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.103180 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.116738 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:08:46 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Jan 30 05:08:46 crc kubenswrapper[4841]: [+]process-running ok Jan 30 05:08:46 crc kubenswrapper[4841]: healthz check failed Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.116805 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.291711 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.292290 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.294549 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.294696 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.300656 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.318312 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2f7jj"] Jan 30 05:08:46 crc kubenswrapper[4841]: W0130 05:08:46.327760 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ec5e12_1868_4efd_a76c_e7a06360cb3b.slice/crio-a083385be97f65aa7fe56523cba5017456bef1b3b22c07eab2fffa9bd9ada7f9 WatchSource:0}: Error finding container a083385be97f65aa7fe56523cba5017456bef1b3b22c07eab2fffa9bd9ada7f9: Status 404 returned error can't find the container with id a083385be97f65aa7fe56523cba5017456bef1b3b22c07eab2fffa9bd9ada7f9 Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.365498 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3712fbb-4357-4dca-a13d-6c39538fdd30-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.365569 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e3712fbb-4357-4dca-a13d-6c39538fdd30-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.443237 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.466394 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3712fbb-4357-4dca-a13d-6c39538fdd30-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.466492 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3712fbb-4357-4dca-a13d-6c39538fdd30-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.466798 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3712fbb-4357-4dca-a13d-6c39538fdd30-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.491239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3712fbb-4357-4dca-a13d-6c39538fdd30-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.585185 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xzz8x"] Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.586123 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.588906 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.601441 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzz8x"] Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.622588 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.669653 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-utilities\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.669725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-catalog-content\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.669772 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrng\" 
(UniqueName: \"kubernetes.io/projected/6a6ce2ed-da59-4d16-8d01-022b22e746f1-kube-api-access-fkrng\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.770810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-utilities\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.771116 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-catalog-content\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.771157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrng\" (UniqueName: \"kubernetes.io/projected/6a6ce2ed-da59-4d16-8d01-022b22e746f1-kube-api-access-fkrng\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.771701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-utilities\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.771865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-catalog-content\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.798492 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrng\" (UniqueName: \"kubernetes.io/projected/6a6ce2ed-da59-4d16-8d01-022b22e746f1-kube-api-access-fkrng\") pod \"redhat-marketplace-xzz8x\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.868975 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.921679 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.968563 4841 generic.go:334] "Generic (PLEG): container finished" podID="705df608-7f08-4d29-aaf2-c39ae4f0e0cd" containerID="44c5fce58ba6b5265fed46e62435d71d97b647c9e13baca4ce7f2c4c81a3671f" exitCode=0 Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.968650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" event={"ID":"705df608-7f08-4d29-aaf2-c39ae4f0e0cd","Type":"ContainerDied","Data":"44c5fce58ba6b5265fed46e62435d71d97b647c9e13baca4ce7f2c4c81a3671f"} Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.983999 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e336685-24da-4b58-b586-c3f673d2c226" containerID="318ea930abaac4e7eb4f201fa18ea8021537231a4a1deb336d99f99e9f1feef4" exitCode=0 Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.984090 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-5rhzf" event={"ID":"0e336685-24da-4b58-b586-c3f673d2c226","Type":"ContainerDied","Data":"318ea930abaac4e7eb4f201fa18ea8021537231a4a1deb336d99f99e9f1feef4"} Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.984334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rhzf" event={"ID":"0e336685-24da-4b58-b586-c3f673d2c226","Type":"ContainerStarted","Data":"24249701f22c0b651677da26cbf77ebf9586352c6b2b85423131f8c03e7283e7"} Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.989289 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qp57k"] Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.990393 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.995087 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerID="e8c0e06eedb659d46fec811ebbbe96bd2efa57cdd4fa3c04b26083b550a1ad44" exitCode=0 Jan 30 05:08:46 crc kubenswrapper[4841]: I0130 05:08:46.995157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhvlf" event={"ID":"f7b9e216-aaea-4222-ab65-efadd17f2f46","Type":"ContainerDied","Data":"e8c0e06eedb659d46fec811ebbbe96bd2efa57cdd4fa3c04b26083b550a1ad44"} Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.005325 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qp57k"] Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.007182 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" event={"ID":"79ec5e12-1868-4efd-a76c-e7a06360cb3b","Type":"ContainerStarted","Data":"f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28"} Jan 30 05:08:47 crc 
kubenswrapper[4841]: I0130 05:08:47.007217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" event={"ID":"79ec5e12-1868-4efd-a76c-e7a06360cb3b","Type":"ContainerStarted","Data":"a083385be97f65aa7fe56523cba5017456bef1b3b22c07eab2fffa9bd9ada7f9"} Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.007888 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.014169 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e3712fbb-4357-4dca-a13d-6c39538fdd30","Type":"ContainerStarted","Data":"70e0d4946857a1239fef02b768226039a6e874d5c889ad8fcba0853df36f84e1"} Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.063095 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" podStartSLOduration=40.063078067 podStartE2EDuration="40.063078067s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:47.057632822 +0000 UTC m=+64.051105460" watchObservedRunningTime="2026-01-30 05:08:47.063078067 +0000 UTC m=+64.056550705" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.075142 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brjwf\" (UniqueName: \"kubernetes.io/projected/5ff4432f-c571-42f5-a82c-58b4cc8be05d-kube-api-access-brjwf\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.075225 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-catalog-content\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.075274 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-utilities\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.116141 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:08:47 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Jan 30 05:08:47 crc kubenswrapper[4841]: [+]process-running ok Jan 30 05:08:47 crc kubenswrapper[4841]: healthz check failed Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.116218 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.121196 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzz8x"] Jan 30 05:08:47 crc kubenswrapper[4841]: W0130 05:08:47.141496 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a6ce2ed_da59_4d16_8d01_022b22e746f1.slice/crio-04839d41d0b38db171354dcaa9510504c7298ab5eb1c6ba11acd1bf902471b9b WatchSource:0}: 
Error finding container 04839d41d0b38db171354dcaa9510504c7298ab5eb1c6ba11acd1bf902471b9b: Status 404 returned error can't find the container with id 04839d41d0b38db171354dcaa9510504c7298ab5eb1c6ba11acd1bf902471b9b Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.176833 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brjwf\" (UniqueName: \"kubernetes.io/projected/5ff4432f-c571-42f5-a82c-58b4cc8be05d-kube-api-access-brjwf\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.177255 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-catalog-content\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.177509 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-utilities\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.178070 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-catalog-content\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.178641 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-utilities\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.196772 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpv8z" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.199182 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brjwf\" (UniqueName: \"kubernetes.io/projected/5ff4432f-c571-42f5-a82c-58b4cc8be05d-kube-api-access-brjwf\") pod \"redhat-marketplace-qp57k\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.315161 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.599335 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qp57k"] Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.783676 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9pv7l"] Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.784589 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.788299 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.790472 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pv7l"] Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.897544 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-catalog-content\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.897584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmv29\" (UniqueName: \"kubernetes.io/projected/13c7d511-de5b-4e9d-acdc-615d18346215-kube-api-access-lmv29\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.897609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-utilities\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.998710 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmv29\" (UniqueName: \"kubernetes.io/projected/13c7d511-de5b-4e9d-acdc-615d18346215-kube-api-access-lmv29\") pod \"redhat-operators-9pv7l\" (UID: 
\"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.998978 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-catalog-content\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.999017 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-utilities\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:47 crc kubenswrapper[4841]: I0130 05:08:47.999452 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-catalog-content\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.000451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-utilities\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.015001 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmv29\" (UniqueName: \"kubernetes.io/projected/13c7d511-de5b-4e9d-acdc-615d18346215-kube-api-access-lmv29\") pod \"redhat-operators-9pv7l\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " 
pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.020318 4841 generic.go:334] "Generic (PLEG): container finished" podID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerID="38501d9648ca5a8165332856b2251c566c7b943950eff4659a144c6a5135d104" exitCode=0 Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.020426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qp57k" event={"ID":"5ff4432f-c571-42f5-a82c-58b4cc8be05d","Type":"ContainerDied","Data":"38501d9648ca5a8165332856b2251c566c7b943950eff4659a144c6a5135d104"} Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.020465 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qp57k" event={"ID":"5ff4432f-c571-42f5-a82c-58b4cc8be05d","Type":"ContainerStarted","Data":"e65e439041c71d89d1aebe378b547b1081bcc614360977fa385b6db464c0d190"} Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.026753 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerID="39b9f7530dc9e32f761d0518b1512207f4775634bc4cb5140afc51abfa02aa8a" exitCode=0 Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.026817 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzz8x" event={"ID":"6a6ce2ed-da59-4d16-8d01-022b22e746f1","Type":"ContainerDied","Data":"39b9f7530dc9e32f761d0518b1512207f4775634bc4cb5140afc51abfa02aa8a"} Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.026886 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzz8x" event={"ID":"6a6ce2ed-da59-4d16-8d01-022b22e746f1","Type":"ContainerStarted","Data":"04839d41d0b38db171354dcaa9510504c7298ab5eb1c6ba11acd1bf902471b9b"} Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.028668 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="e3712fbb-4357-4dca-a13d-6c39538fdd30" containerID="5b73a1790c125d6d7fe8779930643e98d85f8d9301a8155b89b45fc12d76f57e" exitCode=0 Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.028742 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e3712fbb-4357-4dca-a13d-6c39538fdd30","Type":"ContainerDied","Data":"5b73a1790c125d6d7fe8779930643e98d85f8d9301a8155b89b45fc12d76f57e"} Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.055141 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.072280 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n2xzz" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.088185 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.092464 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-r9f92" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.126384 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.127148 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 05:08:48 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Jan 30 05:08:48 crc kubenswrapper[4841]: [+]process-running ok Jan 30 05:08:48 crc kubenswrapper[4841]: healthz check failed Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.127184 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.155239 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.155842 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.158463 4841 patch_prober.go:28] interesting pod/console-f9d7485db-7zvv6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.158662 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7zvv6" podUID="68fe97c1-4b26-445b-af5b-73808e119f0b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 30 05:08:48 crc 
kubenswrapper[4841]: I0130 05:08:48.189319 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qhg2s"] Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.190825 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.205653 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhg2s"] Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.305149 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-utilities\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.305250 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-catalog-content\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.305274 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55jn\" (UniqueName: \"kubernetes.io/projected/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-kube-api-access-r55jn\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: E0130 05:08:48.384026 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:08:48 crc kubenswrapper[4841]: E0130 05:08:48.394412 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:08:48 crc kubenswrapper[4841]: E0130 05:08:48.398629 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:08:48 crc kubenswrapper[4841]: E0130 05:08:48.398691 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.406054 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-utilities\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.406211 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-catalog-content\") pod \"redhat-operators-qhg2s\" (UID: 
\"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.406232 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55jn\" (UniqueName: \"kubernetes.io/projected/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-kube-api-access-r55jn\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.406936 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-utilities\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.407134 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-catalog-content\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.423547 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r55jn\" (UniqueName: \"kubernetes.io/projected/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-kube-api-access-r55jn\") pod \"redhat-operators-qhg2s\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.457301 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pv7l"] Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.460543 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:48 crc kubenswrapper[4841]: W0130 05:08:48.485930 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13c7d511_de5b_4e9d_acdc_615d18346215.slice/crio-570516297c13d95ae4b70da6b81dcd5101ee8c2acf2068c1ebdd69b190288575 WatchSource:0}: Error finding container 570516297c13d95ae4b70da6b81dcd5101ee8c2acf2068c1ebdd69b190288575: Status 404 returned error can't find the container with id 570516297c13d95ae4b70da6b81dcd5101ee8c2acf2068c1ebdd69b190288575 Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.495297 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-t9l6j container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.495328 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t9l6j" podUID="76152f0e-2b76-469b-a55e-f94c53fe9e4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.495616 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-t9l6j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.495662 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t9l6j" podUID="76152f0e-2b76-469b-a55e-f94c53fe9e4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 
10.217.0.24:8080: connect: connection refused" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.507242 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-secret-volume\") pod \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.507275 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-config-volume\") pod \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.507318 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jpxb\" (UniqueName: \"kubernetes.io/projected/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-kube-api-access-6jpxb\") pod \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\" (UID: \"705df608-7f08-4d29-aaf2-c39ae4f0e0cd\") " Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.508945 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-config-volume" (OuterVolumeSpecName: "config-volume") pod "705df608-7f08-4d29-aaf2-c39ae4f0e0cd" (UID: "705df608-7f08-4d29-aaf2-c39ae4f0e0cd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.524934 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-kube-api-access-6jpxb" (OuterVolumeSpecName: "kube-api-access-6jpxb") pod "705df608-7f08-4d29-aaf2-c39ae4f0e0cd" (UID: "705df608-7f08-4d29-aaf2-c39ae4f0e0cd"). InnerVolumeSpecName "kube-api-access-6jpxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.525094 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "705df608-7f08-4d29-aaf2-c39ae4f0e0cd" (UID: "705df608-7f08-4d29-aaf2-c39ae4f0e0cd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.556643 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.608312 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.608354 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.608364 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jpxb\" (UniqueName: \"kubernetes.io/projected/705df608-7f08-4d29-aaf2-c39ae4f0e0cd-kube-api-access-6jpxb\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:48 crc kubenswrapper[4841]: I0130 05:08:48.879228 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhg2s"] Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.057540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhg2s" event={"ID":"94ac7c20-1f9a-4fb1-8107-1159fb740ab5","Type":"ContainerStarted","Data":"9bf34ec0055fc61e9675f76dbb8b7a9f57f97dc2d00d18ef4568f369432ce27f"} Jan 30 05:08:49 
crc kubenswrapper[4841]: I0130 05:08:49.077291 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" event={"ID":"705df608-7f08-4d29-aaf2-c39ae4f0e0cd","Type":"ContainerDied","Data":"a4b7499dfa26d0b3af94839e2637a6d6f58f2496dc1d15423893565f032f42eb"} Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.077329 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b7499dfa26d0b3af94839e2637a6d6f58f2496dc1d15423893565f032f42eb" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.077384 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.105863 4841 generic.go:334] "Generic (PLEG): container finished" podID="13c7d511-de5b-4e9d-acdc-615d18346215" containerID="66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8" exitCode=0 Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.105935 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pv7l" event={"ID":"13c7d511-de5b-4e9d-acdc-615d18346215","Type":"ContainerDied","Data":"66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8"} Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.106972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pv7l" event={"ID":"13c7d511-de5b-4e9d-acdc-615d18346215","Type":"ContainerStarted","Data":"570516297c13d95ae4b70da6b81dcd5101ee8c2acf2068c1ebdd69b190288575"} Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.116605 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.137158 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.239815 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.506522 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.639416 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3712fbb-4357-4dca-a13d-6c39538fdd30-kube-api-access\") pod \"e3712fbb-4357-4dca-a13d-6c39538fdd30\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.639772 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3712fbb-4357-4dca-a13d-6c39538fdd30-kubelet-dir\") pod \"e3712fbb-4357-4dca-a13d-6c39538fdd30\" (UID: \"e3712fbb-4357-4dca-a13d-6c39538fdd30\") " Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.639847 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3712fbb-4357-4dca-a13d-6c39538fdd30-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e3712fbb-4357-4dca-a13d-6c39538fdd30" (UID: "e3712fbb-4357-4dca-a13d-6c39538fdd30"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.648067 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3712fbb-4357-4dca-a13d-6c39538fdd30-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e3712fbb-4357-4dca-a13d-6c39538fdd30" (UID: "e3712fbb-4357-4dca-a13d-6c39538fdd30"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.741643 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3712fbb-4357-4dca-a13d-6c39538fdd30-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:49 crc kubenswrapper[4841]: I0130 05:08:49.741681 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3712fbb-4357-4dca-a13d-6c39538fdd30-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:08:50 crc kubenswrapper[4841]: I0130 05:08:50.119222 4841 generic.go:334] "Generic (PLEG): container finished" podID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerID="e25ea69e52cb8c365944a6c242e10e46d579d484ebcaa8303c0749273a13e21f" exitCode=0 Jan 30 05:08:50 crc kubenswrapper[4841]: I0130 05:08:50.119285 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhg2s" event={"ID":"94ac7c20-1f9a-4fb1-8107-1159fb740ab5","Type":"ContainerDied","Data":"e25ea69e52cb8c365944a6c242e10e46d579d484ebcaa8303c0749273a13e21f"} Jan 30 05:08:50 crc kubenswrapper[4841]: I0130 05:08:50.123309 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e3712fbb-4357-4dca-a13d-6c39538fdd30","Type":"ContainerDied","Data":"70e0d4946857a1239fef02b768226039a6e874d5c889ad8fcba0853df36f84e1"} Jan 30 05:08:50 crc kubenswrapper[4841]: I0130 05:08:50.123341 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e0d4946857a1239fef02b768226039a6e874d5c889ad8fcba0853df36f84e1" Jan 30 05:08:50 crc kubenswrapper[4841]: I0130 05:08:50.128736 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 05:08:50 crc kubenswrapper[4841]: I0130 05:08:50.133461 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r5skt" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.227741 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 05:08:52 crc kubenswrapper[4841]: E0130 05:08:52.228302 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3712fbb-4357-4dca-a13d-6c39538fdd30" containerName="pruner" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.228315 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3712fbb-4357-4dca-a13d-6c39538fdd30" containerName="pruner" Jan 30 05:08:52 crc kubenswrapper[4841]: E0130 05:08:52.228328 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705df608-7f08-4d29-aaf2-c39ae4f0e0cd" containerName="collect-profiles" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.228334 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="705df608-7f08-4d29-aaf2-c39ae4f0e0cd" containerName="collect-profiles" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.228443 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="705df608-7f08-4d29-aaf2-c39ae4f0e0cd" containerName="collect-profiles" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.228458 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3712fbb-4357-4dca-a13d-6c39538fdd30" containerName="pruner" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.228894 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.231954 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.232781 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.232996 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.310177 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.310336 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.411907 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.412000 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.412060 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.440051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.562881 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.924806 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.927529 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.950912 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e275bab-612f-4fe8-8a4f-792634265c15-metrics-certs\") pod \"network-metrics-daemon-25sxv\" (UID: \"1e275bab-612f-4fe8-8a4f-792634265c15\") " pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:52 crc kubenswrapper[4841]: I0130 05:08:52.968481 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:08:53 crc kubenswrapper[4841]: I0130 05:08:53.047184 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 30 05:08:53 crc kubenswrapper[4841]: I0130 05:08:53.055762 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25sxv" Jan 30 05:08:53 crc kubenswrapper[4841]: I0130 05:08:53.161287 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 05:08:53 crc kubenswrapper[4841]: W0130 05:08:53.180878 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbdc39d8e_c44d_4b29_99de_a59a89d898cd.slice/crio-7b46d9c71d469685f7bf1872caa496602ccb1e32fd3c285ae037121df71f5f5c WatchSource:0}: Error finding container 7b46d9c71d469685f7bf1872caa496602ccb1e32fd3c285ae037121df71f5f5c: Status 404 returned error can't find the container with id 7b46d9c71d469685f7bf1872caa496602ccb1e32fd3c285ae037121df71f5f5c Jan 30 05:08:53 crc kubenswrapper[4841]: I0130 05:08:53.760245 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-25sxv"] Jan 30 05:08:54 crc kubenswrapper[4841]: I0130 05:08:54.176328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdc39d8e-c44d-4b29-99de-a59a89d898cd","Type":"ContainerStarted","Data":"ad98f961286af4d9ec3649043ab7460fe0270896e961468f4a4af86de99d0f46"} Jan 30 05:08:54 crc kubenswrapper[4841]: I0130 05:08:54.176654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdc39d8e-c44d-4b29-99de-a59a89d898cd","Type":"ContainerStarted","Data":"7b46d9c71d469685f7bf1872caa496602ccb1e32fd3c285ae037121df71f5f5c"} Jan 30 05:08:54 crc kubenswrapper[4841]: I0130 05:08:54.354533 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7wz6l" Jan 30 05:08:54 crc kubenswrapper[4841]: I0130 05:08:54.374316 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.374271545 
podStartE2EDuration="2.374271545s" podCreationTimestamp="2026-01-30 05:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:54.197646726 +0000 UTC m=+71.191119374" watchObservedRunningTime="2026-01-30 05:08:54.374271545 +0000 UTC m=+71.367744183" Jan 30 05:08:55 crc kubenswrapper[4841]: I0130 05:08:55.191613 4841 generic.go:334] "Generic (PLEG): container finished" podID="bdc39d8e-c44d-4b29-99de-a59a89d898cd" containerID="ad98f961286af4d9ec3649043ab7460fe0270896e961468f4a4af86de99d0f46" exitCode=0 Jan 30 05:08:55 crc kubenswrapper[4841]: I0130 05:08:55.191675 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdc39d8e-c44d-4b29-99de-a59a89d898cd","Type":"ContainerDied","Data":"ad98f961286af4d9ec3649043ab7460fe0270896e961468f4a4af86de99d0f46"} Jan 30 05:08:56 crc kubenswrapper[4841]: I0130 05:08:56.453101 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 30 05:08:58 crc kubenswrapper[4841]: I0130 05:08:58.161139 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:58 crc kubenswrapper[4841]: I0130 05:08:58.166993 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:08:58 crc kubenswrapper[4841]: I0130 05:08:58.196736 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.196715892 podStartE2EDuration="2.196715892s" podCreationTimestamp="2026-01-30 05:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:08:58.191715149 +0000 UTC m=+75.185187787" 
watchObservedRunningTime="2026-01-30 05:08:58.196715892 +0000 UTC m=+75.190188530" Jan 30 05:08:58 crc kubenswrapper[4841]: E0130 05:08:58.381459 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:08:58 crc kubenswrapper[4841]: E0130 05:08:58.383535 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:08:58 crc kubenswrapper[4841]: E0130 05:08:58.386353 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:08:58 crc kubenswrapper[4841]: E0130 05:08:58.386428 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:08:58 crc kubenswrapper[4841]: I0130 05:08:58.499696 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-t9l6j" Jan 30 05:09:01 crc kubenswrapper[4841]: I0130 05:09:01.963218 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.132233 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kubelet-dir\") pod \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.132609 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kube-api-access\") pod \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\" (UID: \"bdc39d8e-c44d-4b29-99de-a59a89d898cd\") " Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.132385 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bdc39d8e-c44d-4b29-99de-a59a89d898cd" (UID: "bdc39d8e-c44d-4b29-99de-a59a89d898cd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.133050 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.138181 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bdc39d8e-c44d-4b29-99de-a59a89d898cd" (UID: "bdc39d8e-c44d-4b29-99de-a59a89d898cd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.234197 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc39d8e-c44d-4b29-99de-a59a89d898cd-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.269988 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdc39d8e-c44d-4b29-99de-a59a89d898cd","Type":"ContainerDied","Data":"7b46d9c71d469685f7bf1872caa496602ccb1e32fd3c285ae037121df71f5f5c"} Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.270034 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b46d9c71d469685f7bf1872caa496602ccb1e32fd3c285ae037121df71f5f5c" Jan 30 05:09:02 crc kubenswrapper[4841]: I0130 05:09:02.270055 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 05:09:03 crc kubenswrapper[4841]: W0130 05:09:03.188465 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e275bab_612f_4fe8_8a4f_792634265c15.slice/crio-c6b7bd9e78f8c5c5dff7f2a95fcd451a8a1d0fd25aeb7f33006fb4f77417ae7f WatchSource:0}: Error finding container c6b7bd9e78f8c5c5dff7f2a95fcd451a8a1d0fd25aeb7f33006fb4f77417ae7f: Status 404 returned error can't find the container with id c6b7bd9e78f8c5c5dff7f2a95fcd451a8a1d0fd25aeb7f33006fb4f77417ae7f Jan 30 05:09:03 crc kubenswrapper[4841]: I0130 05:09:03.280350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25sxv" event={"ID":"1e275bab-612f-4fe8-8a4f-792634265c15","Type":"ContainerStarted","Data":"c6b7bd9e78f8c5c5dff7f2a95fcd451a8a1d0fd25aeb7f33006fb4f77417ae7f"} Jan 30 05:09:06 crc kubenswrapper[4841]: I0130 05:09:06.108902 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:09:08 crc kubenswrapper[4841]: E0130 05:09:08.382996 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:08 crc kubenswrapper[4841]: E0130 05:09:08.385049 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:08 crc kubenswrapper[4841]: E0130 05:09:08.386751 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:08 crc kubenswrapper[4841]: E0130 05:09:08.386794 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:09:14 crc kubenswrapper[4841]: I0130 05:09:14.347738 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-wh9ns_87eccd50-4e4a-408b-aa2e-3c431b6d17d0/kube-multus-additional-cni-plugins/0.log" Jan 30 05:09:14 crc kubenswrapper[4841]: I0130 
05:09:14.348153 4841 generic.go:334] "Generic (PLEG): container finished" podID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" exitCode=137 Jan 30 05:09:14 crc kubenswrapper[4841]: I0130 05:09:14.348179 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" event={"ID":"87eccd50-4e4a-408b-aa2e-3c431b6d17d0","Type":"ContainerDied","Data":"693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114"} Jan 30 05:09:18 crc kubenswrapper[4841]: E0130 05:09:18.379576 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:18 crc kubenswrapper[4841]: E0130 05:09:18.381981 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:18 crc kubenswrapper[4841]: E0130 05:09:18.382620 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:18 crc kubenswrapper[4841]: E0130 05:09:18.382688 4841 prober.go:104] "Probe errored" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:09:19 crc kubenswrapper[4841]: E0130 05:09:19.036037 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 05:09:19 crc kubenswrapper[4841]: E0130 05:09:19.036387 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-st6r9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t88dq_openshift-marketplace(12231fcc-9527-405e-bac6-734865031f83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:09:19 crc kubenswrapper[4841]: E0130 05:09:19.037596 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t88dq" podUID="12231fcc-9527-405e-bac6-734865031f83" Jan 30 05:09:19 crc 
kubenswrapper[4841]: I0130 05:09:19.238508 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p8zbg" Jan 30 05:09:20 crc kubenswrapper[4841]: I0130 05:09:20.366710 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 05:09:22 crc kubenswrapper[4841]: E0130 05:09:22.801529 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t88dq" podUID="12231fcc-9527-405e-bac6-734865031f83" Jan 30 05:09:23 crc kubenswrapper[4841]: I0130 05:09:23.458727 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 30 05:09:24 crc kubenswrapper[4841]: I0130 05:09:24.490447 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.490381851 podStartE2EDuration="1.490381851s" podCreationTimestamp="2026-01-30 05:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:24.483588141 +0000 UTC m=+101.477060819" watchObservedRunningTime="2026-01-30 05:09:24.490381851 +0000 UTC m=+101.483854529" Jan 30 05:09:26 crc kubenswrapper[4841]: E0130 05:09:26.998427 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 05:09:26 crc kubenswrapper[4841]: E0130 05:09:26.998822 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjx45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5rhzf_openshift-marketplace(0e336685-24da-4b58-b586-c3f673d2c226): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:09:27 crc kubenswrapper[4841]: E0130 05:09:27.000097 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5rhzf" podUID="0e336685-24da-4b58-b586-c3f673d2c226" Jan 30 05:09:27 crc kubenswrapper[4841]: E0130 05:09:27.800572 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5rhzf" podUID="0e336685-24da-4b58-b586-c3f673d2c226" Jan 30 05:09:27 crc kubenswrapper[4841]: E0130 05:09:27.901606 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 05:09:27 crc kubenswrapper[4841]: E0130 05:09:27.901800 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lmv29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9pv7l_openshift-marketplace(13c7d511-de5b-4e9d-acdc-615d18346215): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:09:27 crc kubenswrapper[4841]: E0130 05:09:27.903042 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9pv7l" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" Jan 30 05:09:27 crc 
kubenswrapper[4841]: E0130 05:09:27.980963 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 30 05:09:27 crc kubenswrapper[4841]: E0130 05:09:27.981125 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btxhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-g8lz7_openshift-marketplace(4a117bdc-99af-4fd8-a810-b2d08e174f77): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:09:27 crc kubenswrapper[4841]: E0130 05:09:27.982298 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-g8lz7" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" Jan 30 05:09:28 crc kubenswrapper[4841]: E0130 05:09:28.380133 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:28 crc kubenswrapper[4841]: E0130 05:09:28.380386 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 05:09:28 crc kubenswrapper[4841]: E0130 05:09:28.380821 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Jan 30 05:09:28 crc kubenswrapper[4841]: E0130 05:09:28.380851 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.021149 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 05:09:29 crc kubenswrapper[4841]: E0130 05:09:29.021421 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc39d8e-c44d-4b29-99de-a59a89d898cd" containerName="pruner" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.021436 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc39d8e-c44d-4b29-99de-a59a89d898cd" containerName="pruner" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.021593 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc39d8e-c44d-4b29-99de-a59a89d898cd" containerName="pruner" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.021982 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.024748 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.025299 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.029779 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.203590 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf0300-8e0e-422b-b6f6-5648dd610ede-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.203681 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf0300-8e0e-422b-b6f6-5648dd610ede-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.305389 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf0300-8e0e-422b-b6f6-5648dd610ede-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.305507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/89bf0300-8e0e-422b-b6f6-5648dd610ede-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.305525 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf0300-8e0e-422b-b6f6-5648dd610ede-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.326665 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf0300-8e0e-422b-b6f6-5648dd610ede-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:29 crc kubenswrapper[4841]: I0130 05:09:29.357828 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 05:09:30 crc kubenswrapper[4841]: E0130 05:09:30.426473 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-g8lz7" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" Jan 30 05:09:30 crc kubenswrapper[4841]: E0130 05:09:30.426538 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9pv7l" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" Jan 30 05:09:30 crc kubenswrapper[4841]: E0130 05:09:30.523380 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 05:09:30 crc kubenswrapper[4841]: E0130 05:09:30.523624 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r55jn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qhg2s_openshift-marketplace(94ac7c20-1f9a-4fb1-8107-1159fb740ab5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:09:30 crc kubenswrapper[4841]: E0130 05:09:30.525301 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qhg2s" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" Jan 30 05:09:30 crc 
kubenswrapper[4841]: I0130 05:09:30.927715 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 05:09:30 crc kubenswrapper[4841]: W0130 05:09:30.937508 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod89bf0300_8e0e_422b_b6f6_5648dd610ede.slice/crio-6f87c2c42e352b6484f40f1617af23dc966c1dd7e4790dfbac554d39c34d0f1e WatchSource:0}: Error finding container 6f87c2c42e352b6484f40f1617af23dc966c1dd7e4790dfbac554d39c34d0f1e: Status 404 returned error can't find the container with id 6f87c2c42e352b6484f40f1617af23dc966c1dd7e4790dfbac554d39c34d0f1e Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.490988 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"89bf0300-8e0e-422b-b6f6-5648dd610ede","Type":"ContainerStarted","Data":"6f87c2c42e352b6484f40f1617af23dc966c1dd7e4790dfbac554d39c34d0f1e"} Jan 30 05:09:31 crc kubenswrapper[4841]: E0130 05:09:31.496987 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qhg2s" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.767694 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-wh9ns_87eccd50-4e4a-408b-aa2e-3c431b6d17d0/kube-multus-additional-cni-plugins/0.log" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.768195 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.940701 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-tuning-conf-dir\") pod \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.941056 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist\") pod \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.941223 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-ready\") pod \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.941594 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gpkz\" (UniqueName: \"kubernetes.io/projected/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-kube-api-access-8gpkz\") pod \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\" (UID: \"87eccd50-4e4a-408b-aa2e-3c431b6d17d0\") " Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.940870 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "87eccd50-4e4a-408b-aa2e-3c431b6d17d0" (UID: "87eccd50-4e4a-408b-aa2e-3c431b6d17d0"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.941846 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-ready" (OuterVolumeSpecName: "ready") pod "87eccd50-4e4a-408b-aa2e-3c431b6d17d0" (UID: "87eccd50-4e4a-408b-aa2e-3c431b6d17d0"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.942224 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "87eccd50-4e4a-408b-aa2e-3c431b6d17d0" (UID: "87eccd50-4e4a-408b-aa2e-3c431b6d17d0"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.942511 4841 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.942672 4841 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.942813 4841 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-ready\") on node \"crc\" DevicePath \"\"" Jan 30 05:09:31 crc kubenswrapper[4841]: I0130 05:09:31.950491 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-kube-api-access-8gpkz" (OuterVolumeSpecName: 
"kube-api-access-8gpkz") pod "87eccd50-4e4a-408b-aa2e-3c431b6d17d0" (UID: "87eccd50-4e4a-408b-aa2e-3c431b6d17d0"). InnerVolumeSpecName "kube-api-access-8gpkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.044539 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gpkz\" (UniqueName: \"kubernetes.io/projected/87eccd50-4e4a-408b-aa2e-3c431b6d17d0-kube-api-access-8gpkz\") on node \"crc\" DevicePath \"\"" Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.499801 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-wh9ns_87eccd50-4e4a-408b-aa2e-3c431b6d17d0/kube-multus-additional-cni-plugins/0.log" Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.499987 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.500813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wh9ns" event={"ID":"87eccd50-4e4a-408b-aa2e-3c431b6d17d0","Type":"ContainerDied","Data":"1cb3ba3a872080b8feeee5f7a44e324674e82ae7de9f5b4629024b7bbc64af39"} Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.500860 4841 scope.go:117] "RemoveContainer" containerID="693b52457c78d88e1b82b0202ea84a21be07d4dc4738f063f43bb027f5a0e114" Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.503652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25sxv" event={"ID":"1e275bab-612f-4fe8-8a4f-792634265c15","Type":"ContainerStarted","Data":"09119a922bf129bf4a888106297cccf9c85cdaa52977a222de9185a63fc3e016"} Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.506365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"89bf0300-8e0e-422b-b6f6-5648dd610ede","Type":"ContainerStarted","Data":"b719a953b907073e7e6c8a70cc538c5faccdffd5f32ac7904866ae10377a9fa8"} Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.522342 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wh9ns"] Jan 30 05:09:32 crc kubenswrapper[4841]: I0130 05:09:32.525720 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wh9ns"] Jan 30 05:09:32 crc kubenswrapper[4841]: E0130 05:09:32.899578 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 30 05:09:32 crc kubenswrapper[4841]: E0130 05:09:32.899740 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wg2tp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nhvlf_openshift-marketplace(f7b9e216-aaea-4222-ab65-efadd17f2f46): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:09:32 crc kubenswrapper[4841]: E0130 05:09:32.900987 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nhvlf" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" Jan 30 05:09:33 crc 
kubenswrapper[4841]: E0130 05:09:33.517564 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nhvlf" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.221649 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 05:09:34 crc kubenswrapper[4841]: E0130 05:09:34.222061 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.222102 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.222462 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" containerName="kube-multus-additional-cni-plugins" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.223182 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.235276 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.375903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kube-api-access\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.376228 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.376527 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-var-lock\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.462473 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87eccd50-4e4a-408b-aa2e-3c431b6d17d0" path="/var/lib/kubelet/pods/87eccd50-4e4a-408b-aa2e-3c431b6d17d0/volumes" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.477919 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.478027 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-var-lock\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.478070 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kube-api-access\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.478164 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.478198 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-var-lock\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.524341 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kube-api-access\") pod \"installer-9-crc\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.531474 4841 generic.go:334] "Generic (PLEG): container 
finished" podID="89bf0300-8e0e-422b-b6f6-5648dd610ede" containerID="b719a953b907073e7e6c8a70cc538c5faccdffd5f32ac7904866ae10377a9fa8" exitCode=0 Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.531787 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"89bf0300-8e0e-422b-b6f6-5648dd610ede","Type":"ContainerDied","Data":"b719a953b907073e7e6c8a70cc538c5faccdffd5f32ac7904866ae10377a9fa8"} Jan 30 05:09:34 crc kubenswrapper[4841]: E0130 05:09:34.540604 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 05:09:34 crc kubenswrapper[4841]: E0130 05:09:34.540982 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-brjwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qp57k_openshift-marketplace(5ff4432f-c571-42f5-a82c-58b4cc8be05d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 05:09:34 crc kubenswrapper[4841]: E0130 05:09:34.542452 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qp57k" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" Jan 30 05:09:34 crc 
kubenswrapper[4841]: I0130 05:09:34.556799 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:09:34 crc kubenswrapper[4841]: E0130 05:09:34.653006 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 05:09:34 crc kubenswrapper[4841]: E0130 05:09:34.653145 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkrng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy
:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xzz8x_openshift-marketplace(6a6ce2ed-da59-4d16-8d01-022b22e746f1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 30 05:09:34 crc kubenswrapper[4841]: E0130 05:09:34.654243 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xzz8x" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1"
Jan 30 05:09:34 crc kubenswrapper[4841]: I0130 05:09:34.745143 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 30 05:09:34 crc kubenswrapper[4841]: W0130 05:09:34.748950 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1cf712d8_8aae_4943_8737_0fe6d9eda70d.slice/crio-801fc81a14940d82a4d6d9bdcd38a7e58c64193a441192ec4a996d559a1fa549 WatchSource:0}: Error finding container 801fc81a14940d82a4d6d9bdcd38a7e58c64193a441192ec4a996d559a1fa549: Status 404 returned error can't find the container with id 801fc81a14940d82a4d6d9bdcd38a7e58c64193a441192ec4a996d559a1fa549
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.539413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25sxv" event={"ID":"1e275bab-612f-4fe8-8a4f-792634265c15","Type":"ContainerStarted","Data":"12ae315610ebfea30cbdf9b1785a17a62ce09b2519d9918051b2114ca2134814"}
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.541896 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1cf712d8-8aae-4943-8737-0fe6d9eda70d","Type":"ContainerStarted","Data":"a0e9d35f815a7e80ada692c642063de37dd780d655cd885325b3f60a80801b41"}
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.542019 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1cf712d8-8aae-4943-8737-0fe6d9eda70d","Type":"ContainerStarted","Data":"801fc81a14940d82a4d6d9bdcd38a7e58c64193a441192ec4a996d559a1fa549"}
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.543777 4841 generic.go:334] "Generic (PLEG): container finished" podID="12231fcc-9527-405e-bac6-734865031f83" containerID="eabf638e1e003020430b9da88dfa4942f3166ae57a60ec4df1c92e392ec86c4b" exitCode=0
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.544263 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t88dq" event={"ID":"12231fcc-9527-405e-bac6-734865031f83","Type":"ContainerDied","Data":"eabf638e1e003020430b9da88dfa4942f3166ae57a60ec4df1c92e392ec86c4b"}
Jan 30 05:09:35 crc kubenswrapper[4841]: E0130 05:09:35.556042 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xzz8x" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1"
Jan 30 05:09:35 crc kubenswrapper[4841]: E0130 05:09:35.556330 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qp57k" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d"
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.558699 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-25sxv" podStartSLOduration=88.558680915 podStartE2EDuration="1m28.558680915s" podCreationTimestamp="2026-01-30 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:35.552452708 +0000 UTC m=+112.545925336" watchObservedRunningTime="2026-01-30 05:09:35.558680915 +0000 UTC m=+112.552153553"
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.632154 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.6321361730000001 podStartE2EDuration="1.632136173s" podCreationTimestamp="2026-01-30 05:09:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:09:35.609984582 +0000 UTC m=+112.603457220" watchObservedRunningTime="2026-01-30 05:09:35.632136173 +0000 UTC m=+112.625608811"
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.809001 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.897103 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf0300-8e0e-422b-b6f6-5648dd610ede-kube-api-access\") pod \"89bf0300-8e0e-422b-b6f6-5648dd610ede\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") "
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.897214 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf0300-8e0e-422b-b6f6-5648dd610ede-kubelet-dir\") pod \"89bf0300-8e0e-422b-b6f6-5648dd610ede\" (UID: \"89bf0300-8e0e-422b-b6f6-5648dd610ede\") "
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.897604 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89bf0300-8e0e-422b-b6f6-5648dd610ede-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "89bf0300-8e0e-422b-b6f6-5648dd610ede" (UID: "89bf0300-8e0e-422b-b6f6-5648dd610ede"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.898123 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf0300-8e0e-422b-b6f6-5648dd610ede-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.919626 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89bf0300-8e0e-422b-b6f6-5648dd610ede-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "89bf0300-8e0e-422b-b6f6-5648dd610ede" (UID: "89bf0300-8e0e-422b-b6f6-5648dd610ede"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:09:35 crc kubenswrapper[4841]: I0130 05:09:35.999856 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf0300-8e0e-422b-b6f6-5648dd610ede-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 05:09:36 crc kubenswrapper[4841]: I0130 05:09:36.551477 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"89bf0300-8e0e-422b-b6f6-5648dd610ede","Type":"ContainerDied","Data":"6f87c2c42e352b6484f40f1617af23dc966c1dd7e4790dfbac554d39c34d0f1e"}
Jan 30 05:09:36 crc kubenswrapper[4841]: I0130 05:09:36.551793 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f87c2c42e352b6484f40f1617af23dc966c1dd7e4790dfbac554d39c34d0f1e"
Jan 30 05:09:36 crc kubenswrapper[4841]: I0130 05:09:36.551519 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 30 05:09:36 crc kubenswrapper[4841]: I0130 05:09:36.555977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t88dq" event={"ID":"12231fcc-9527-405e-bac6-734865031f83","Type":"ContainerStarted","Data":"c3e3310da4b49162dfc532fc80d8b8d94dae98b15b44de787a1da985dfef4b5f"}
Jan 30 05:09:36 crc kubenswrapper[4841]: I0130 05:09:36.589497 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t88dq" podStartSLOduration=2.351135773 podStartE2EDuration="52.589467368s" podCreationTimestamp="2026-01-30 05:08:44 +0000 UTC" firstStartedPulling="2026-01-30 05:08:45.941672595 +0000 UTC m=+62.935145243" lastFinishedPulling="2026-01-30 05:09:36.1800042 +0000 UTC m=+113.173476838" observedRunningTime="2026-01-30 05:09:36.586688864 +0000 UTC m=+113.580161512" watchObservedRunningTime="2026-01-30 05:09:36.589467368 +0000 UTC m=+113.582940066"
Jan 30 05:09:41 crc kubenswrapper[4841]: I0130 05:09:41.594183 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e336685-24da-4b58-b586-c3f673d2c226" containerID="c5b0821e7ca6094a67fdda30dadcf3bf2ae990d0fc28de7d0ef13c1b74d8cfc6" exitCode=0
Jan 30 05:09:41 crc kubenswrapper[4841]: I0130 05:09:41.594240 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rhzf" event={"ID":"0e336685-24da-4b58-b586-c3f673d2c226","Type":"ContainerDied","Data":"c5b0821e7ca6094a67fdda30dadcf3bf2ae990d0fc28de7d0ef13c1b74d8cfc6"}
Jan 30 05:09:42 crc kubenswrapper[4841]: I0130 05:09:42.600246 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rhzf" event={"ID":"0e336685-24da-4b58-b586-c3f673d2c226","Type":"ContainerStarted","Data":"18507df8bc9ba1a66cc277b4a240d53becc9d78bb61888ce6259fdd7aeb52717"}
Jan 30 05:09:42 crc kubenswrapper[4841]: I0130 05:09:42.603209 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pv7l" event={"ID":"13c7d511-de5b-4e9d-acdc-615d18346215","Type":"ContainerStarted","Data":"7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d"}
Jan 30 05:09:42 crc kubenswrapper[4841]: I0130 05:09:42.617368 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5rhzf" podStartSLOduration=3.616208117 podStartE2EDuration="58.617354527s" podCreationTimestamp="2026-01-30 05:08:44 +0000 UTC" firstStartedPulling="2026-01-30 05:08:46.990590726 +0000 UTC m=+63.984063364" lastFinishedPulling="2026-01-30 05:09:41.991737136 +0000 UTC m=+118.985209774" observedRunningTime="2026-01-30 05:09:42.615522308 +0000 UTC m=+119.608994956" watchObservedRunningTime="2026-01-30 05:09:42.617354527 +0000 UTC m=+119.610827165"
Jan 30 05:09:43 crc kubenswrapper[4841]: I0130 05:09:43.613154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8lz7" event={"ID":"4a117bdc-99af-4fd8-a810-b2d08e174f77","Type":"ContainerStarted","Data":"2c66e8c8fcd07a8f41b10764c3f1d21ade4ef113625266fc99161e0efb1ce4fe"}
Jan 30 05:09:43 crc kubenswrapper[4841]: I0130 05:09:43.617437 4841 generic.go:334] "Generic (PLEG): container finished" podID="13c7d511-de5b-4e9d-acdc-615d18346215" containerID="7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d" exitCode=0
Jan 30 05:09:43 crc kubenswrapper[4841]: I0130 05:09:43.617482 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pv7l" event={"ID":"13c7d511-de5b-4e9d-acdc-615d18346215","Type":"ContainerDied","Data":"7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d"}
Jan 30 05:09:44 crc kubenswrapper[4841]: I0130 05:09:44.626934 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pv7l" event={"ID":"13c7d511-de5b-4e9d-acdc-615d18346215","Type":"ContainerStarted","Data":"70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284"}
Jan 30 05:09:44 crc kubenswrapper[4841]: I0130 05:09:44.631659 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhg2s" event={"ID":"94ac7c20-1f9a-4fb1-8107-1159fb740ab5","Type":"ContainerStarted","Data":"4fadf4bb12ce50091dbbc931199bb56cef1efd3749cacc5863e373801393bff5"}
Jan 30 05:09:44 crc kubenswrapper[4841]: I0130 05:09:44.635087 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8lz7" event={"ID":"4a117bdc-99af-4fd8-a810-b2d08e174f77","Type":"ContainerDied","Data":"2c66e8c8fcd07a8f41b10764c3f1d21ade4ef113625266fc99161e0efb1ce4fe"}
Jan 30 05:09:44 crc kubenswrapper[4841]: I0130 05:09:44.635109 4841 generic.go:334] "Generic (PLEG): container finished" podID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerID="2c66e8c8fcd07a8f41b10764c3f1d21ade4ef113625266fc99161e0efb1ce4fe" exitCode=0
Jan 30 05:09:44 crc kubenswrapper[4841]: I0130 05:09:44.658685 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9pv7l" podStartSLOduration=2.752510295 podStartE2EDuration="57.658661304s" podCreationTimestamp="2026-01-30 05:08:47 +0000 UTC" firstStartedPulling="2026-01-30 05:08:49.116426876 +0000 UTC m=+66.109899514" lastFinishedPulling="2026-01-30 05:09:44.022577885 +0000 UTC m=+121.016050523" observedRunningTime="2026-01-30 05:09:44.650371553 +0000 UTC m=+121.643844191" watchObservedRunningTime="2026-01-30 05:09:44.658661304 +0000 UTC m=+121.652133952"
Jan 30 05:09:44 crc kubenswrapper[4841]: I0130 05:09:44.934664 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t88dq"
Jan 30 05:09:44 crc kubenswrapper[4841]: I0130 05:09:44.934958 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t88dq"
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.086013 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t88dq"
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.328084 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5rhzf"
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.328134 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5rhzf"
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.368248 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5rhzf"
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.641892 4841 generic.go:334] "Generic (PLEG): container finished" podID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerID="4fadf4bb12ce50091dbbc931199bb56cef1efd3749cacc5863e373801393bff5" exitCode=0
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.641965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhg2s" event={"ID":"94ac7c20-1f9a-4fb1-8107-1159fb740ab5","Type":"ContainerDied","Data":"4fadf4bb12ce50091dbbc931199bb56cef1efd3749cacc5863e373801393bff5"}
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.646610 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8lz7" event={"ID":"4a117bdc-99af-4fd8-a810-b2d08e174f77","Type":"ContainerStarted","Data":"53c57b5e2798421fc4a66deb59265f89b7723297b8552704b8d55553ae59063d"}
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.681926 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g8lz7" podStartSLOduration=2.594677549 podStartE2EDuration="1m1.681903877s" podCreationTimestamp="2026-01-30 05:08:44 +0000 UTC" firstStartedPulling="2026-01-30 05:08:45.960591942 +0000 UTC m=+62.954064580" lastFinishedPulling="2026-01-30 05:09:45.04781827 +0000 UTC m=+122.041290908" observedRunningTime="2026-01-30 05:09:45.67638995 +0000 UTC m=+122.669862588" watchObservedRunningTime="2026-01-30 05:09:45.681903877 +0000 UTC m=+122.675376515"
Jan 30 05:09:45 crc kubenswrapper[4841]: I0130 05:09:45.695497 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t88dq"
Jan 30 05:09:48 crc kubenswrapper[4841]: I0130 05:09:48.127317 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9pv7l"
Jan 30 05:09:48 crc kubenswrapper[4841]: I0130 05:09:48.127373 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9pv7l"
Jan 30 05:09:49 crc kubenswrapper[4841]: I0130 05:09:49.176172 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9pv7l" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:09:49 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:09:49 crc kubenswrapper[4841]: >
Jan 30 05:09:55 crc kubenswrapper[4841]: I0130 05:09:55.125787 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g8lz7"
Jan 30 05:09:55 crc kubenswrapper[4841]: I0130 05:09:55.126478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g8lz7"
Jan 30 05:09:55 crc kubenswrapper[4841]: I0130 05:09:55.184348 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g8lz7"
Jan 30 05:09:55 crc kubenswrapper[4841]: I0130 05:09:55.371906 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5rhzf"
Jan 30 05:09:55 crc kubenswrapper[4841]: I0130 05:09:55.752629 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g8lz7"
Jan 30 05:09:57 crc kubenswrapper[4841]: I0130 05:09:57.677544 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rhzf"]
Jan 30 05:09:57 crc kubenswrapper[4841]: I0130 05:09:57.678211 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5rhzf" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="registry-server" containerID="cri-o://18507df8bc9ba1a66cc277b4a240d53becc9d78bb61888ce6259fdd7aeb52717" gracePeriod=2
Jan 30 05:09:58 crc kubenswrapper[4841]: I0130 05:09:58.198942 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9pv7l"
Jan 30 05:09:58 crc kubenswrapper[4841]: I0130 05:09:58.258368 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9pv7l"
Jan 30 05:09:58 crc kubenswrapper[4841]: I0130 05:09:58.715972 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e336685-24da-4b58-b586-c3f673d2c226" containerID="18507df8bc9ba1a66cc277b4a240d53becc9d78bb61888ce6259fdd7aeb52717" exitCode=0
Jan 30 05:09:58 crc kubenswrapper[4841]: I0130 05:09:58.716038 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rhzf" event={"ID":"0e336685-24da-4b58-b586-c3f673d2c226","Type":"ContainerDied","Data":"18507df8bc9ba1a66cc277b4a240d53becc9d78bb61888ce6259fdd7aeb52717"}
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.290144 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rhzf"
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.405621 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjx45\" (UniqueName: \"kubernetes.io/projected/0e336685-24da-4b58-b586-c3f673d2c226-kube-api-access-vjx45\") pod \"0e336685-24da-4b58-b586-c3f673d2c226\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") "
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.405671 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-utilities\") pod \"0e336685-24da-4b58-b586-c3f673d2c226\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") "
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.405769 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-catalog-content\") pod \"0e336685-24da-4b58-b586-c3f673d2c226\" (UID: \"0e336685-24da-4b58-b586-c3f673d2c226\") "
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.406619 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-utilities" (OuterVolumeSpecName: "utilities") pod "0e336685-24da-4b58-b586-c3f673d2c226" (UID: "0e336685-24da-4b58-b586-c3f673d2c226"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.416660 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e336685-24da-4b58-b586-c3f673d2c226-kube-api-access-vjx45" (OuterVolumeSpecName: "kube-api-access-vjx45") pod "0e336685-24da-4b58-b586-c3f673d2c226" (UID: "0e336685-24da-4b58-b586-c3f673d2c226"). InnerVolumeSpecName "kube-api-access-vjx45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.451468 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e336685-24da-4b58-b586-c3f673d2c226" (UID: "0e336685-24da-4b58-b586-c3f673d2c226"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.507972 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjx45\" (UniqueName: \"kubernetes.io/projected/0e336685-24da-4b58-b586-c3f673d2c226-kube-api-access-vjx45\") on node \"crc\" DevicePath \"\""
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.508020 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.508035 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e336685-24da-4b58-b586-c3f673d2c226-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.723381 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhg2s" event={"ID":"94ac7c20-1f9a-4fb1-8107-1159fb740ab5","Type":"ContainerStarted","Data":"b7a2e2dfb466fbe724a2791aab2210e95283d153d0a371780b19df51140ce75e"}
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.725585 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rhzf"
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.725971 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rhzf" event={"ID":"0e336685-24da-4b58-b586-c3f673d2c226","Type":"ContainerDied","Data":"24249701f22c0b651677da26cbf77ebf9586352c6b2b85423131f8c03e7283e7"}
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.726003 4841 scope.go:117] "RemoveContainer" containerID="18507df8bc9ba1a66cc277b4a240d53becc9d78bb61888ce6259fdd7aeb52717"
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.728885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzz8x" event={"ID":"6a6ce2ed-da59-4d16-8d01-022b22e746f1","Type":"ContainerStarted","Data":"1f783a52545342ff19e5d16cde005f3b602741f7151f446a8ce5dcdea35fa19a"}
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.755884 4841 scope.go:117] "RemoveContainer" containerID="c5b0821e7ca6094a67fdda30dadcf3bf2ae990d0fc28de7d0ef13c1b74d8cfc6"
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.770693 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qhg2s" podStartSLOduration=6.944190124 podStartE2EDuration="1m11.770671062s" podCreationTimestamp="2026-01-30 05:08:48 +0000 UTC" firstStartedPulling="2026-01-30 05:08:50.122598134 +0000 UTC m=+67.116070772" lastFinishedPulling="2026-01-30 05:09:54.949079042 +0000 UTC m=+131.942551710" observedRunningTime="2026-01-30 05:09:59.757896297 +0000 UTC m=+136.751368935" watchObservedRunningTime="2026-01-30 05:09:59.770671062 +0000 UTC m=+136.764143710"
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.774329 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rhzf"]
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.780496 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5rhzf"]
Jan 30 05:09:59 crc kubenswrapper[4841]: I0130 05:09:59.780585 4841 scope.go:117] "RemoveContainer" containerID="318ea930abaac4e7eb4f201fa18ea8021537231a4a1deb336d99f99e9f1feef4"
Jan 30 05:10:00 crc kubenswrapper[4841]: I0130 05:10:00.448524 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e336685-24da-4b58-b586-c3f673d2c226" path="/var/lib/kubelet/pods/0e336685-24da-4b58-b586-c3f673d2c226/volumes"
Jan 30 05:10:00 crc kubenswrapper[4841]: I0130 05:10:00.735618 4841 generic.go:334] "Generic (PLEG): container finished" podID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerID="f91bd6cd600e439faed0c14adc100bcc1482d66a25768eda5a05334ba4171c0b" exitCode=0
Jan 30 05:10:00 crc kubenswrapper[4841]: I0130 05:10:00.735674 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qp57k" event={"ID":"5ff4432f-c571-42f5-a82c-58b4cc8be05d","Type":"ContainerDied","Data":"f91bd6cd600e439faed0c14adc100bcc1482d66a25768eda5a05334ba4171c0b"}
Jan 30 05:10:00 crc kubenswrapper[4841]: I0130 05:10:00.737207 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerID="1f783a52545342ff19e5d16cde005f3b602741f7151f446a8ce5dcdea35fa19a" exitCode=0
Jan 30 05:10:00 crc kubenswrapper[4841]: I0130 05:10:00.737229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzz8x" event={"ID":"6a6ce2ed-da59-4d16-8d01-022b22e746f1","Type":"ContainerDied","Data":"1f783a52545342ff19e5d16cde005f3b602741f7151f446a8ce5dcdea35fa19a"}
Jan 30 05:10:00 crc kubenswrapper[4841]: I0130 05:10:00.739134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhvlf" event={"ID":"f7b9e216-aaea-4222-ab65-efadd17f2f46","Type":"ContainerStarted","Data":"7a9237339dded0cc16f47c9935d3a3286d5ceac3b7e00245f27c81ba1202991e"}
Jan 30 05:10:01 crc kubenswrapper[4841]: I0130 05:10:01.747523 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerID="7a9237339dded0cc16f47c9935d3a3286d5ceac3b7e00245f27c81ba1202991e" exitCode=0
Jan 30 05:10:01 crc kubenswrapper[4841]: I0130 05:10:01.747657 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhvlf" event={"ID":"f7b9e216-aaea-4222-ab65-efadd17f2f46","Type":"ContainerDied","Data":"7a9237339dded0cc16f47c9935d3a3286d5ceac3b7e00245f27c81ba1202991e"}
Jan 30 05:10:02 crc kubenswrapper[4841]: I0130 05:10:02.759302 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qp57k" event={"ID":"5ff4432f-c571-42f5-a82c-58b4cc8be05d","Type":"ContainerStarted","Data":"6be740d9b06a7a65a3edf7a92a9882679d9656ac9467a2d44e01a89242d320b5"}
Jan 30 05:10:02 crc kubenswrapper[4841]: I0130 05:10:02.762844 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzz8x" event={"ID":"6a6ce2ed-da59-4d16-8d01-022b22e746f1","Type":"ContainerStarted","Data":"4ba16a3964468fbeafe5fd4b370713974204613a888d8d447b8c93eca07054f9"}
Jan 30 05:10:02 crc kubenswrapper[4841]: I0130 05:10:02.764716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhvlf" event={"ID":"f7b9e216-aaea-4222-ab65-efadd17f2f46","Type":"ContainerStarted","Data":"954b3dae9e656fbd6e938c990df2ecf605a709518726fa0c5a5e715eff693039"}
Jan 30 05:10:02 crc kubenswrapper[4841]: I0130 05:10:02.781486 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qp57k" podStartSLOduration=2.657362329 podStartE2EDuration="1m16.781467231s" podCreationTimestamp="2026-01-30 05:08:46 +0000 UTC" firstStartedPulling="2026-01-30 05:08:48.023877166 +0000 UTC m=+65.017349804" lastFinishedPulling="2026-01-30 05:10:02.147982058 +0000 UTC m=+139.141454706" observedRunningTime="2026-01-30 05:10:02.779167877 +0000 UTC m=+139.772640515" watchObservedRunningTime="2026-01-30 05:10:02.781467231 +0000 UTC m=+139.774939869"
Jan 30 05:10:02 crc kubenswrapper[4841]: I0130 05:10:02.801333 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xzz8x" podStartSLOduration=2.593638478 podStartE2EDuration="1m16.801307785s" podCreationTimestamp="2026-01-30 05:08:46 +0000 UTC" firstStartedPulling="2026-01-30 05:08:48.027967664 +0000 UTC m=+65.021440302" lastFinishedPulling="2026-01-30 05:10:02.235636961 +0000 UTC m=+139.229109609" observedRunningTime="2026-01-30 05:10:02.799881456 +0000 UTC m=+139.793354094" watchObservedRunningTime="2026-01-30 05:10:02.801307785 +0000 UTC m=+139.794780433"
Jan 30 05:10:02 crc kubenswrapper[4841]: I0130 05:10:02.827259 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nhvlf" podStartSLOduration=2.597692731 podStartE2EDuration="1m17.827245809s" podCreationTimestamp="2026-01-30 05:08:45 +0000 UTC" firstStartedPulling="2026-01-30 05:08:46.996413071 +0000 UTC m=+63.989885709" lastFinishedPulling="2026-01-30 05:10:02.225966139 +0000 UTC m=+139.219438787" observedRunningTime="2026-01-30 05:10:02.824513049 +0000 UTC m=+139.817985687" watchObservedRunningTime="2026-01-30 05:10:02.827245809 +0000 UTC m=+139.820718437"
Jan 30 05:10:05 crc kubenswrapper[4841]: I0130 05:10:05.541222 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nhvlf"
Jan 30 05:10:05 crc kubenswrapper[4841]: I0130 05:10:05.541499 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nhvlf"
Jan 30 05:10:05 crc kubenswrapper[4841]: I0130 05:10:05.587962 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nhvlf"
Jan 30 05:10:06 crc kubenswrapper[4841]: I0130 05:10:06.242953 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5hkjb"]
Jan 30 05:10:06 crc kubenswrapper[4841]: I0130 05:10:06.922468 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xzz8x"
Jan 30 05:10:06 crc kubenswrapper[4841]: I0130 05:10:06.922913 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xzz8x"
Jan 30 05:10:06 crc kubenswrapper[4841]: I0130 05:10:06.994138 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xzz8x"
Jan 30 05:10:07 crc kubenswrapper[4841]: I0130 05:10:07.315562 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qp57k"
Jan 30 05:10:07 crc kubenswrapper[4841]: I0130 05:10:07.315842 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qp57k"
Jan 30 05:10:07 crc kubenswrapper[4841]: I0130 05:10:07.376120 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qp57k"
Jan 30 05:10:07 crc kubenswrapper[4841]: I0130 05:10:07.866521 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qp57k"
Jan 30 05:10:07 crc kubenswrapper[4841]: I0130 05:10:07.869493 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xzz8x"
Jan 30 05:10:08 crc kubenswrapper[4841]: I0130 05:10:08.558379 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qhg2s"
Jan 30 05:10:08 crc kubenswrapper[4841]: I0130 05:10:08.558505 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qhg2s"
Jan 30 05:10:08 crc kubenswrapper[4841]: I0130 05:10:08.618572 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qhg2s"
Jan 30 05:10:08 crc kubenswrapper[4841]: I0130 05:10:08.882864 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qhg2s"
Jan 30 05:10:10 crc kubenswrapper[4841]: I0130 05:10:10.077744 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qp57k"]
Jan 30 05:10:10 crc kubenswrapper[4841]: I0130 05:10:10.855963 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qp57k" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerName="registry-server" containerID="cri-o://6be740d9b06a7a65a3edf7a92a9882679d9656ac9467a2d44e01a89242d320b5" gracePeriod=2
Jan 30 05:10:11 crc kubenswrapper[4841]: I0130 05:10:11.875453 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qhg2s"]
Jan 30 05:10:11 crc kubenswrapper[4841]: I0130 05:10:11.875799 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qhg2s" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="registry-server" containerID="cri-o://b7a2e2dfb466fbe724a2791aab2210e95283d153d0a371780b19df51140ce75e" gracePeriod=2
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.963937 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.964560 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="extract-content"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.964577 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="extract-content"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.964591 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89bf0300-8e0e-422b-b6f6-5648dd610ede" containerName="pruner"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.964599 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bf0300-8e0e-422b-b6f6-5648dd610ede" containerName="pruner"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.964615 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="extract-utilities"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.964624 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="extract-utilities"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.964636 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="registry-server"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.964644 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="registry-server"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.964768 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="89bf0300-8e0e-422b-b6f6-5648dd610ede" containerName="pruner"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.964784 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e336685-24da-4b58-b586-c3f673d2c226" containerName="registry-server"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.965163 4841 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.965491 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617" gracePeriod=15
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.965641 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9" gracePeriod=15
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.965702 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea" gracePeriod=15
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.965794 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d" gracePeriod=15
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.965795 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54" gracePeriod=15
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.967818 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968099 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968692 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968719 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968745 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968758 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968776 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968788 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968808 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968821 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968836 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 30
05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968847 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968860 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968873 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968895 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968908 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 05:10:12 crc kubenswrapper[4841]: E0130 05:10:12.968926 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.968938 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.969119 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.969142 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.969157 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.969174 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.969192 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.969249 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.969610 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 05:10:12 crc kubenswrapper[4841]: I0130 05:10:12.980136 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.031773 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108236 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108266 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108285 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108427 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108520 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.108573 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.209835 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.210284 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.210632 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.210941 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.211225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.210714 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.210366 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.211103 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.210007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.211278 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.212275 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.212331 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.212734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.212624 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.213160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.213240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.324017 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:10:13 crc kubenswrapper[4841]: W0130 05:10:13.353840 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-8e8f492d6dd4cd5e26ae1402b5252ab958f5da88cc2c72984b2c696ad18c7c4b WatchSource:0}: Error finding container 8e8f492d6dd4cd5e26ae1402b5252ab958f5da88cc2c72984b2c696ad18c7c4b: Status 404 returned error can't find the container with id 8e8f492d6dd4cd5e26ae1402b5252ab958f5da88cc2c72984b2c696ad18c7c4b Jan 30 05:10:13 crc kubenswrapper[4841]: E0130 05:10:13.358441 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.36:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f6a142673bf1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:10:13.35690422 +0000 UTC m=+150.350376898,LastTimestamp:2026-01-30 05:10:13.35690422 +0000 UTC m=+150.350376898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.878540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8e8f492d6dd4cd5e26ae1402b5252ab958f5da88cc2c72984b2c696ad18c7c4b"} Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.882277 4841 generic.go:334] "Generic (PLEG): container finished" podID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerID="6be740d9b06a7a65a3edf7a92a9882679d9656ac9467a2d44e01a89242d320b5" exitCode=0 Jan 30 05:10:13 crc kubenswrapper[4841]: I0130 05:10:13.882343 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qp57k" event={"ID":"5ff4432f-c571-42f5-a82c-58b4cc8be05d","Type":"ContainerDied","Data":"6be740d9b06a7a65a3edf7a92a9882679d9656ac9467a2d44e01a89242d320b5"} Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.434243 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.891814 4841 generic.go:334] "Generic (PLEG): container finished" podID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" containerID="a0e9d35f815a7e80ada692c642063de37dd780d655cd885325b3f60a80801b41" exitCode=0 Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.891920 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1cf712d8-8aae-4943-8737-0fe6d9eda70d","Type":"ContainerDied","Data":"a0e9d35f815a7e80ada692c642063de37dd780d655cd885325b3f60a80801b41"} Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.892903 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.893301 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.897042 4841 generic.go:334] "Generic (PLEG): container finished" podID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerID="b7a2e2dfb466fbe724a2791aab2210e95283d153d0a371780b19df51140ce75e" exitCode=0 Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.897124 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhg2s" event={"ID":"94ac7c20-1f9a-4fb1-8107-1159fb740ab5","Type":"ContainerDied","Data":"b7a2e2dfb466fbe724a2791aab2210e95283d153d0a371780b19df51140ce75e"} Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.900517 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.902485 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.903906 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9" exitCode=0 Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 
05:10:14.903949 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54" exitCode=0 Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.903969 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d" exitCode=0 Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.903985 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea" exitCode=2 Jan 30 05:10:14 crc kubenswrapper[4841]: I0130 05:10:14.903997 4841 scope.go:117] "RemoveContainer" containerID="042b316a938f2cfaf322c1e62ed9d763a258d623818e78bb450c08625cffe6d3" Jan 30 05:10:15 crc kubenswrapper[4841]: I0130 05:10:15.617077 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:10:15 crc kubenswrapper[4841]: I0130 05:10:15.619930 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:15 crc kubenswrapper[4841]: I0130 05:10:15.620675 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:15 crc kubenswrapper[4841]: I0130 05:10:15.621199 4841 status_manager.go:851] "Failed to get 
status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:15 crc kubenswrapper[4841]: I0130 05:10:15.917197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d9331a16299a53b6af16bb9da4cd45a78dd22e4b98e043516dcfd444a3187ba2"} Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.218629 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.219284 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.219750 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.220684 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" 
Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.362859 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-var-lock\") pod \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.362952 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-var-lock" (OuterVolumeSpecName: "var-lock") pod "1cf712d8-8aae-4943-8737-0fe6d9eda70d" (UID: "1cf712d8-8aae-4943-8737-0fe6d9eda70d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.363017 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kubelet-dir\") pod \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.363099 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1cf712d8-8aae-4943-8737-0fe6d9eda70d" (UID: "1cf712d8-8aae-4943-8737-0fe6d9eda70d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.363146 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kube-api-access\") pod \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\" (UID: \"1cf712d8-8aae-4943-8737-0fe6d9eda70d\") " Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.363629 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.363697 4841 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1cf712d8-8aae-4943-8737-0fe6d9eda70d-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.369717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1cf712d8-8aae-4943-8737-0fe6d9eda70d" (UID: "1cf712d8-8aae-4943-8737-0fe6d9eda70d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.464946 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf712d8-8aae-4943-8737-0fe6d9eda70d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.927901 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1cf712d8-8aae-4943-8737-0fe6d9eda70d","Type":"ContainerDied","Data":"801fc81a14940d82a4d6d9bdcd38a7e58c64193a441192ec4a996d559a1fa549"} Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.927960 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="801fc81a14940d82a4d6d9bdcd38a7e58c64193a441192ec4a996d559a1fa549" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.928027 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.934651 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.935529 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:16 crc kubenswrapper[4841]: I0130 05:10:16.936038 4841 status_manager.go:851] "Failed to get status for pod" 
podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: E0130 05:10:17.240423 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.36:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f6a142673bf1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:10:13.35690422 +0000 UTC m=+150.350376898,LastTimestamp:2026-01-30 05:10:13.35690422 +0000 UTC m=+150.350376898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.251675 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.256005 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.256740 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.257277 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.257806 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.310338 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.311484 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.312176 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.312632 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.313023 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.313246 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.313667 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.346632 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.347170 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.348024 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.348631 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.348864 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.349696 
4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.350153 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.381264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brjwf\" (UniqueName: \"kubernetes.io/projected/5ff4432f-c571-42f5-a82c-58b4cc8be05d-kube-api-access-brjwf\") pod \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.381325 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-catalog-content\") pod \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.381346 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-utilities\") pod \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\" (UID: \"5ff4432f-c571-42f5-a82c-58b4cc8be05d\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.382300 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-utilities" (OuterVolumeSpecName: "utilities") pod "5ff4432f-c571-42f5-a82c-58b4cc8be05d" (UID: "5ff4432f-c571-42f5-a82c-58b4cc8be05d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.390368 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff4432f-c571-42f5-a82c-58b4cc8be05d-kube-api-access-brjwf" (OuterVolumeSpecName: "kube-api-access-brjwf") pod "5ff4432f-c571-42f5-a82c-58b4cc8be05d" (UID: "5ff4432f-c571-42f5-a82c-58b4cc8be05d"). InnerVolumeSpecName "kube-api-access-brjwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.412386 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ff4432f-c571-42f5-a82c-58b4cc8be05d" (UID: "5ff4432f-c571-42f5-a82c-58b4cc8be05d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482514 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r55jn\" (UniqueName: \"kubernetes.io/projected/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-kube-api-access-r55jn\") pod \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482580 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482606 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-utilities\") pod \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482622 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-catalog-content\") pod \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\" (UID: \"94ac7c20-1f9a-4fb1-8107-1159fb740ab5\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482670 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482710 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482801 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482929 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.482976 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.483515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-utilities" (OuterVolumeSpecName: "utilities") pod "94ac7c20-1f9a-4fb1-8107-1159fb740ab5" (UID: "94ac7c20-1f9a-4fb1-8107-1159fb740ab5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.484096 4841 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.484155 4841 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.484185 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brjwf\" (UniqueName: \"kubernetes.io/projected/5ff4432f-c571-42f5-a82c-58b4cc8be05d-kube-api-access-brjwf\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.484213 4841 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.484239 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.484264 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.484287 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff4432f-c571-42f5-a82c-58b4cc8be05d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.487277 4841 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-kube-api-access-r55jn" (OuterVolumeSpecName: "kube-api-access-r55jn") pod "94ac7c20-1f9a-4fb1-8107-1159fb740ab5" (UID: "94ac7c20-1f9a-4fb1-8107-1159fb740ab5"). InnerVolumeSpecName "kube-api-access-r55jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.586361 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r55jn\" (UniqueName: \"kubernetes.io/projected/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-kube-api-access-r55jn\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.605469 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94ac7c20-1f9a-4fb1-8107-1159fb740ab5" (UID: "94ac7c20-1f9a-4fb1-8107-1159fb740ab5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.688182 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94ac7c20-1f9a-4fb1-8107-1159fb740ab5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.937615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qp57k" event={"ID":"5ff4432f-c571-42f5-a82c-58b4cc8be05d","Type":"ContainerDied","Data":"e65e439041c71d89d1aebe378b547b1081bcc614360977fa385b6db464c0d190"} Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.937666 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qp57k" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.937720 4841 scope.go:117] "RemoveContainer" containerID="6be740d9b06a7a65a3edf7a92a9882679d9656ac9467a2d44e01a89242d320b5" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.938615 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.938942 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.939156 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.939423 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.939853 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.940357 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.942449 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.943529 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617" exitCode=0 Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.943718 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.946268 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhg2s" event={"ID":"94ac7c20-1f9a-4fb1-8107-1159fb740ab5","Type":"ContainerDied","Data":"9bf34ec0055fc61e9675f76dbb8b7a9f57f97dc2d00d18ef4568f369432ce27f"} Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.946336 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qhg2s" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.946907 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.947360 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.947827 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.948213 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.948803 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.949253 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.951145 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.951595 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.952914 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.953337 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.953810 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.954228 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.964459 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.965098 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.965616 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.965907 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.966351 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.966874 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.973673 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.974141 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.974650 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.974975 4841 scope.go:117] "RemoveContainer" containerID="f91bd6cd600e439faed0c14adc100bcc1482d66a25768eda5a05334ba4171c0b" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.975071 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.976183 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.976707 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.977358 4841 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.977919 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.978475 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.979082 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.979618 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:17 crc kubenswrapper[4841]: I0130 05:10:17.980081 4841 status_manager.go:851] 
"Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.017151 4841 scope.go:117] "RemoveContainer" containerID="38501d9648ca5a8165332856b2251c566c7b943950eff4659a144c6a5135d104" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.037703 4841 scope.go:117] "RemoveContainer" containerID="f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.057702 4841 scope.go:117] "RemoveContainer" containerID="32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.083240 4841 scope.go:117] "RemoveContainer" containerID="68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.105154 4841 scope.go:117] "RemoveContainer" containerID="54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.127047 4841 scope.go:117] "RemoveContainer" containerID="05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.147422 4841 scope.go:117] "RemoveContainer" containerID="11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.177109 4841 scope.go:117] "RemoveContainer" containerID="f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9" Jan 30 05:10:18 crc kubenswrapper[4841]: E0130 05:10:18.178062 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9\": 
container with ID starting with f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9 not found: ID does not exist" containerID="f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.178115 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9"} err="failed to get container status \"f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9\": rpc error: code = NotFound desc = could not find container \"f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9\": container with ID starting with f9a32e40ddae47aeb3bdffbc164805c1dfcdc5195e7b0d1c61ad87b68976eed9 not found: ID does not exist" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.178171 4841 scope.go:117] "RemoveContainer" containerID="32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54" Jan 30 05:10:18 crc kubenswrapper[4841]: E0130 05:10:18.178684 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\": container with ID starting with 32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54 not found: ID does not exist" containerID="32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.178810 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54"} err="failed to get container status \"32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\": rpc error: code = NotFound desc = could not find container \"32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54\": container with ID starting with 
32d84fb40470b68133af21df0266c8b96f5c13b41179eb3a3df2e7350e315a54 not found: ID does not exist" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.178893 4841 scope.go:117] "RemoveContainer" containerID="68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d" Jan 30 05:10:18 crc kubenswrapper[4841]: E0130 05:10:18.179259 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\": container with ID starting with 68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d not found: ID does not exist" containerID="68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.179292 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d"} err="failed to get container status \"68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\": rpc error: code = NotFound desc = could not find container \"68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d\": container with ID starting with 68b73d8ce348b89404af9f8f029d1b85fca4445541753ef47cf6f430dae5438d not found: ID does not exist" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.179313 4841 scope.go:117] "RemoveContainer" containerID="54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea" Jan 30 05:10:18 crc kubenswrapper[4841]: E0130 05:10:18.179673 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\": container with ID starting with 54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea not found: ID does not exist" containerID="54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea" Jan 30 05:10:18 crc 
kubenswrapper[4841]: I0130 05:10:18.179739 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea"} err="failed to get container status \"54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\": rpc error: code = NotFound desc = could not find container \"54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea\": container with ID starting with 54f1e3ea6334a4d71166ba8585f0ce6f305165ccb61b43b1dece02cf8de389ea not found: ID does not exist" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.179778 4841 scope.go:117] "RemoveContainer" containerID="05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617" Jan 30 05:10:18 crc kubenswrapper[4841]: E0130 05:10:18.180166 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\": container with ID starting with 05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617 not found: ID does not exist" containerID="05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.180277 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617"} err="failed to get container status \"05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\": rpc error: code = NotFound desc = could not find container \"05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617\": container with ID starting with 05080a3ecc5f0e5006d55182aeefcbf5d75a33da37fac449f427c7f7f35cc617 not found: ID does not exist" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.180318 4841 scope.go:117] "RemoveContainer" containerID="11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9" Jan 30 
05:10:18 crc kubenswrapper[4841]: E0130 05:10:18.180624 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\": container with ID starting with 11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9 not found: ID does not exist" containerID="11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.180658 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9"} err="failed to get container status \"11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\": rpc error: code = NotFound desc = could not find container \"11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9\": container with ID starting with 11edf00a576b6dd67554ea46d19a18877a527d333eee43fbda01b485a8085fb9 not found: ID does not exist" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.180679 4841 scope.go:117] "RemoveContainer" containerID="b7a2e2dfb466fbe724a2791aab2210e95283d153d0a371780b19df51140ce75e" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.197868 4841 scope.go:117] "RemoveContainer" containerID="4fadf4bb12ce50091dbbc931199bb56cef1efd3749cacc5863e373801393bff5" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.245780 4841 scope.go:117] "RemoveContainer" containerID="e25ea69e52cb8c365944a6c242e10e46d579d484ebcaa8303c0749273a13e21f" Jan 30 05:10:18 crc kubenswrapper[4841]: I0130 05:10:18.443845 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.031097 4841 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.031954 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.032603 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.033105 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.033540 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:20 crc kubenswrapper[4841]: I0130 05:10:20.033591 4841 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.034204 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="200ms" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.235627 4841 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="400ms" Jan 30 05:10:20 crc kubenswrapper[4841]: E0130 05:10:20.636712 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="800ms" Jan 30 05:10:21 crc kubenswrapper[4841]: E0130 05:10:21.438327 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="1.6s" Jan 30 05:10:23 crc kubenswrapper[4841]: E0130 05:10:23.039682 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="3.2s" Jan 30 05:10:24 crc kubenswrapper[4841]: I0130 05:10:24.436841 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:24 crc kubenswrapper[4841]: I0130 05:10:24.437514 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection 
refused" Jan 30 05:10:24 crc kubenswrapper[4841]: I0130 05:10:24.437942 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:24 crc kubenswrapper[4841]: I0130 05:10:24.438761 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:24 crc kubenswrapper[4841]: I0130 05:10:24.439322 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:26 crc kubenswrapper[4841]: E0130 05:10:26.241223 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="6.4s" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.024437 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.024844 4841 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e" exitCode=1 Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.025001 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e"} Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.025664 4841 scope.go:117] "RemoveContainer" containerID="9ed81e6b5f4b1585d7159940cfdd6b0ca18bd2205e4807bdb2ff042b7d28c49e" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.026785 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.027679 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.028320 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.028878 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.029262 4841 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.029603 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: E0130 05:10:27.241826 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.36:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f6a142673bf1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 05:10:13.35690422 +0000 UTC 
m=+150.350376898,LastTimestamp:2026-01-30 05:10:13.35690422 +0000 UTC m=+150.350376898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.431179 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.432184 4841 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.432969 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.433716 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.434229 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 
38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.434694 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.435188 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.453459 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.453500 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b" Jan 30 05:10:27 crc kubenswrapper[4841]: E0130 05:10:27.454072 4841 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.454725 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:27 crc kubenswrapper[4841]: W0130 05:10:27.485296 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-691247af92c1e6316fe1b423481cf8060313ab535bb19041f9037fe437c84c3d WatchSource:0}: Error finding container 691247af92c1e6316fe1b423481cf8060313ab535bb19041f9037fe437c84c3d: Status 404 returned error can't find the container with id 691247af92c1e6316fe1b423481cf8060313ab535bb19041f9037fe437c84c3d Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.835843 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:10:27 crc kubenswrapper[4841]: I0130 05:10:27.843244 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.039705 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.039849 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dac4c7377c93b9a926db4337edc0156dbbab455403add61a99f2f0138e64c7ef"} Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.040878 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: 
connect: connection refused" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.041443 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.042050 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.042630 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.043225 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.043637 4841 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.36:6443: connect: 
connection refused" Jan 30 05:10:28 crc kubenswrapper[4841]: I0130 05:10:28.047317 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"691247af92c1e6316fe1b423481cf8060313ab535bb19041f9037fe437c84c3d"} Jan 30 05:10:29 crc kubenswrapper[4841]: I0130 05:10:29.058688 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ff15ddb8c15ce55519d5889d19e1caeaae894732729b39ccdfcd458f8c8419eb"} Jan 30 05:10:29 crc kubenswrapper[4841]: I0130 05:10:29.082221 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.067067 4841 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ff15ddb8c15ce55519d5889d19e1caeaae894732729b39ccdfcd458f8c8419eb" exitCode=0 Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.067159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ff15ddb8c15ce55519d5889d19e1caeaae894732729b39ccdfcd458f8c8419eb"} Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.067801 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b" Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.067839 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b" Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.068367 4841 status_manager.go:851] "Failed to get status for pod" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" 
pod="openshift-marketplace/community-operators-nhvlf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nhvlf\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:30 crc kubenswrapper[4841]: E0130 05:10:30.068455 4841 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.069061 4841 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.070056 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.070615 4841 status_manager.go:851] "Failed to get status for pod" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" pod="openshift-marketplace/redhat-operators-qhg2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qhg2s\": dial tcp 38.102.83.36:6443: connect: connection refused" Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.071125 4841 status_manager.go:851] "Failed to get status for pod" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" 
pod="openshift-marketplace/redhat-marketplace-qp57k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-qp57k\": dial tcp 38.102.83.36:6443: connect: connection refused"
Jan 30 05:10:30 crc kubenswrapper[4841]: I0130 05:10:30.071657 4841 status_manager.go:851] "Failed to get status for pod" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused"
Jan 30 05:10:31 crc kubenswrapper[4841]: I0130 05:10:31.271383 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" podUID="cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" containerName="oauth-openshift" containerID="cri-o://aec7788a70a630de48298989ffd5285ecd960aade19357c79f5d21ecb349db3f" gracePeriod=15
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.086985 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1014215e7de235747a2ef8d18a452ab61c63ecf26837dcd033872305850e2a6e"}
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.090151 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" containerID="aec7788a70a630de48298989ffd5285ecd960aade19357c79f5d21ecb349db3f" exitCode=0
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.090200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" event={"ID":"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8","Type":"ContainerDied","Data":"aec7788a70a630de48298989ffd5285ecd960aade19357c79f5d21ecb349db3f"}
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.533022 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.630825 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-provider-selection\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.630900 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-policies\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.630933 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-serving-cert\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.630973 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-session\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631000 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-service-ca\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631018 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-ocp-branding-template\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631038 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-error\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631755 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631832 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-router-certs\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631876 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-idp-0-file-data\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631903 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-trusted-ca-bundle\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631931 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-cliconfig\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631954 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-login\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.631979 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-dir\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.632004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6fqx\" (UniqueName: \"kubernetes.io/projected/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-kube-api-access-z6fqx\") pod \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\" (UID: \"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8\") "
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.632260 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.632632 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.632626 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.632651 4841 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-dir\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.632697 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.633186 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.636095 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.636373 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.636481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-kube-api-access-z6fqx" (OuterVolumeSpecName: "kube-api-access-z6fqx") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "kube-api-access-z6fqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.636528 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.636930 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.640768 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.641064 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.641500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.641682 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" (UID: "cbe3fe34-31a2-4843-9df6-8dd5d6c968d8"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734027 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734260 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734322 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6fqx\" (UniqueName: \"kubernetes.io/projected/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-kube-api-access-z6fqx\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734378 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734473 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734530 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734593 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734653 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734708 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734767 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:32 crc kubenswrapper[4841]: I0130 05:10:32.734829 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:10:33 crc kubenswrapper[4841]: I0130 05:10:33.097023 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e5399dbcc6b09ad80cb3cb85d7cd1a1b55d8f43887d2664559d3fac8024e8c17"}
Jan 30 05:10:33 crc kubenswrapper[4841]: I0130 05:10:33.097386 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"75b76088ef533ddcda5d70d6b481b818111fbb9d363f3e9e6c1dc349ba35df3a"}
Jan 30 05:10:33 crc kubenswrapper[4841]: I0130 05:10:33.098801 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb" event={"ID":"cbe3fe34-31a2-4843-9df6-8dd5d6c968d8","Type":"ContainerDied","Data":"8fd32b2ceaa40ce2ab8660e894215fe4a654d9dfe28e2f70030a41d3abb53ab6"}
Jan 30 05:10:33 crc kubenswrapper[4841]: I0130 05:10:33.098855 4841 scope.go:117] "RemoveContainer" containerID="aec7788a70a630de48298989ffd5285ecd960aade19357c79f5d21ecb349db3f"
Jan 30 05:10:33 crc kubenswrapper[4841]: I0130 05:10:33.098859 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5hkjb"
Jan 30 05:10:34 crc kubenswrapper[4841]: I0130 05:10:34.119060 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c6bd673533b5c588a5ea4acf8b5eeeba1bd2b2a25c3da1850c77b38a4022771d"}
Jan 30 05:10:34 crc kubenswrapper[4841]: I0130 05:10:34.119125 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7ebb104234543dd2e6c7fbd64c34527cdb35e10ee417d135b2f680098e44431e"}
Jan 30 05:10:34 crc kubenswrapper[4841]: I0130 05:10:34.119184 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:10:34 crc kubenswrapper[4841]: I0130 05:10:34.119252 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:34 crc kubenswrapper[4841]: I0130 05:10:34.119272 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:34 crc kubenswrapper[4841]: I0130 05:10:34.127581 4841 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:10:35 crc kubenswrapper[4841]: I0130 05:10:35.128653 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:35 crc kubenswrapper[4841]: I0130 05:10:35.129810 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.455787 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.456160 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.456782 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.456808 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.463693 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.842903 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.843298 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.843378 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Jan 30 05:10:37 crc kubenswrapper[4841]: I0130 05:10:37.966592 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4fbd2442-6fd6-4f81-81bc-df4b690b2b3f"
Jan 30 05:10:38 crc kubenswrapper[4841]: I0130 05:10:38.148876 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:38 crc kubenswrapper[4841]: I0130 05:10:38.148917 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:38 crc kubenswrapper[4841]: I0130 05:10:38.153851 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4fbd2442-6fd6-4f81-81bc-df4b690b2b3f"
Jan 30 05:10:38 crc kubenswrapper[4841]: I0130 05:10:38.155127 4841 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://1014215e7de235747a2ef8d18a452ab61c63ecf26837dcd033872305850e2a6e"
Jan 30 05:10:38 crc kubenswrapper[4841]: I0130 05:10:38.155173 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 05:10:39 crc kubenswrapper[4841]: I0130 05:10:39.155953 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:39 crc kubenswrapper[4841]: I0130 05:10:39.155998 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f791ed0a-befc-479e-862b-deb440b67c6b"
Jan 30 05:10:39 crc kubenswrapper[4841]: I0130 05:10:39.161466 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4fbd2442-6fd6-4f81-81bc-df4b690b2b3f"
Jan 30 05:10:40 crc kubenswrapper[4841]: I0130 05:10:40.464664 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:10:40 crc kubenswrapper[4841]: I0130 05:10:40.465024 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:10:47 crc kubenswrapper[4841]: I0130 05:10:47.203632 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 30 05:10:47 crc kubenswrapper[4841]: I0130 05:10:47.246388 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 30 05:10:47 crc kubenswrapper[4841]: I0130 05:10:47.329704 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 30 05:10:47 crc kubenswrapper[4841]: I0130 05:10:47.650099 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 30 05:10:47 crc kubenswrapper[4841]: I0130 05:10:47.843319 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Jan 30 05:10:47 crc kubenswrapper[4841]: I0130 05:10:47.843463 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Jan 30 05:10:47 crc kubenswrapper[4841]: I0130 05:10:47.923859 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 30 05:10:48 crc kubenswrapper[4841]: I0130 05:10:48.274796 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 30 05:10:48 crc kubenswrapper[4841]: I0130 05:10:48.368348 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 30 05:10:48 crc kubenswrapper[4841]: I0130 05:10:48.435373 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 30 05:10:48 crc kubenswrapper[4841]: I0130 05:10:48.716622 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 30 05:10:48 crc kubenswrapper[4841]: I0130 05:10:48.931472 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 30 05:10:48 crc kubenswrapper[4841]: I0130 05:10:48.934372 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 30 05:10:49 crc kubenswrapper[4841]: I0130 05:10:49.312903 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 30 05:10:49 crc kubenswrapper[4841]: I0130 05:10:49.383805 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 30 05:10:49 crc kubenswrapper[4841]: I0130 05:10:49.490519 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 30 05:10:49 crc kubenswrapper[4841]: I0130 05:10:49.611380 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 30 05:10:49 crc kubenswrapper[4841]: I0130 05:10:49.705856 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 30 05:10:49 crc kubenswrapper[4841]: I0130 05:10:49.994950 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.166316 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.220810 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.253810 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.306884 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.394710 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.549298 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.554359 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.704026 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.720865 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.725493 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.740414 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.790006 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 30 05:10:50 crc kubenswrapper[4841]: I0130 05:10:50.979515 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.224959 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.303666 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.307223 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.315164 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.424778 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.665924 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.674731 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.730872 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.734712 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.754278 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.777508 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 30 05:10:51 crc kubenswrapper[4841]: I0130 05:10:51.938609 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.027471 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.110490 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.143835 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.245751 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.386573 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.400908 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.415523 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.416764 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.433289 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.533797 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.575370 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.708492 4841 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.719112 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.739900 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.880891 4841 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.887592 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.887564662 podStartE2EDuration="39.887564662s" podCreationTimestamp="2026-01-30 05:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:37.84785066 +0000 UTC m=+174.841323338" watchObservedRunningTime="2026-01-30 05:10:52.887564662 +0000 UTC m=+189.881037330"
Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.888735 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api"
pods=["openshift-marketplace/redhat-operators-qhg2s","openshift-marketplace/redhat-marketplace-qp57k","openshift-authentication/oauth-openshift-558db77b4-5hkjb","openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.888820 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.896615 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 05:10:52 crc kubenswrapper[4841]: I0130 05:10:52.918435 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.918378274 podStartE2EDuration="18.918378274s" podCreationTimestamp="2026-01-30 05:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:10:52.914755945 +0000 UTC m=+189.908228623" watchObservedRunningTime="2026-01-30 05:10:52.918378274 +0000 UTC m=+189.911850942" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.052533 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.093713 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.145595 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.300443 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.329569 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.369304 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.384560 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 05:10:53 crc kubenswrapper[4841]: I0130 05:10:53.498746 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.074476 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.114137 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.132620 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.216175 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.246018 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.309593 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.340100 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 
30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.444370 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" path="/var/lib/kubelet/pods/5ff4432f-c571-42f5-a82c-58b4cc8be05d/volumes" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.446009 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" path="/var/lib/kubelet/pods/94ac7c20-1f9a-4fb1-8107-1159fb740ab5/volumes" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.447455 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" path="/var/lib/kubelet/pods/cbe3fe34-31a2-4843-9df6-8dd5d6c968d8/volumes" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.494891 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.579039 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.683612 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.746087 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 30 05:10:54 crc kubenswrapper[4841]: I0130 05:10:54.778589 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.015543 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.051022 4841 reflector.go:368] Caches populated for 
*v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.111624 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157486 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"] Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157693 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerName="extract-utilities" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157706 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerName="extract-utilities" Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157717 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerName="extract-content" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157725 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerName="extract-content" Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157739 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="extract-utilities" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157747 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="extract-utilities" Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157759 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" containerName="installer" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157766 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" 
containerName="installer" Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157781 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerName="registry-server" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157788 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" containerName="registry-server" Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157800 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="registry-server" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157808 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="registry-server" Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157820 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" containerName="oauth-openshift" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157829 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" containerName="oauth-openshift" Jan 30 05:10:55 crc kubenswrapper[4841]: E0130 05:10:55.157840 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="extract-content" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157847 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="extract-content" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157959 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ac7c20-1f9a-4fb1-8107-1159fb740ab5" containerName="registry-server" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157974 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff4432f-c571-42f5-a82c-58b4cc8be05d" 
containerName="registry-server" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.157989 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf712d8-8aae-4943-8737-0fe6d9eda70d" containerName="installer" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.158000 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe3fe34-31a2-4843-9df6-8dd5d6c968d8" containerName="oauth-openshift" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.159023 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.166068 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.166443 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.166550 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.166885 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.167003 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.167088 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.167163 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 
05:10:55.167963 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.168132 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.168265 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.168284 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.168666 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.182119 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.199071 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.222108 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.234580 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.340971 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354124 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-service-ca\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354206 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354246 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-error\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354365 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354447 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/bf3bea98-f750-48e9-82e7-c5905ec78162-audit-dir\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354488 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-audit-policies\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354540 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mlz4\" (UniqueName: \"kubernetes.io/projected/bf3bea98-f750-48e9-82e7-c5905ec78162-kube-api-access-4mlz4\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354581 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354727 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-login\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " 
pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354845 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354899 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.354934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.355003 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-router-certs\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: 
I0130 05:10:55.355084 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-session\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.360722 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.370249 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.442964 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.447208 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456577 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456648 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: 
\"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456692 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-router-certs\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456779 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-session\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-service-ca\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456907 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.456940 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-error\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.457047 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.457107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf3bea98-f750-48e9-82e7-c5905ec78162-audit-dir\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.457158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-audit-policies\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " 
pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.457212 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mlz4\" (UniqueName: \"kubernetes.io/projected/bf3bea98-f750-48e9-82e7-c5905ec78162-kube-api-access-4mlz4\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.457243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.457282 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-login\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.459230 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf3bea98-f750-48e9-82e7-c5905ec78162-audit-dir\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.461255 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.461577 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-audit-policies\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.462073 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-service-ca\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.462859 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.466856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.467039 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-error\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.467858 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.468288 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-login\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.470013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.486152 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.486917 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-router-certs\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.486990 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf3bea98-f750-48e9-82e7-c5905ec78162-v4-0-config-system-session\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.491724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mlz4\" (UniqueName: \"kubernetes.io/projected/bf3bea98-f750-48e9-82e7-c5905ec78162-kube-api-access-4mlz4\") pod \"oauth-openshift-85f4f78dc8-d2x46\" (UID: \"bf3bea98-f750-48e9-82e7-c5905ec78162\") " pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.521208 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.559878 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.573998 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.592014 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.714636 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.723986 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.871254 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 30 05:10:55 crc kubenswrapper[4841]: I0130 05:10:55.943585 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.045848 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.079139 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.159675 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.210851 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.304199 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.518044 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.604697 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.612513 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.773156 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.774481 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.777898 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.927152 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 30 05:10:56 crc kubenswrapper[4841]: I0130 05:10:56.972054 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.046371 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.069565 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.110024 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.289577 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.335257 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.347133 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.354299 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.529201 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.549882 4841 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.561997 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.614931 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.832780 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.849505 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.859235 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.879874 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.890188 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.934714 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 30 05:10:57 crc kubenswrapper[4841]: I0130 05:10:57.961389 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.038825 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.041887 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.190167 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.201172 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.209720 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.262367 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.493084 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.511224 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.512937 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.568754 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.676969 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.779173 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.801958 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.861189 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.864459 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 30 05:10:58 crc kubenswrapper[4841]: I0130 05:10:58.873437 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.117376 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.151211 4841 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.152021 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d9331a16299a53b6af16bb9da4cd45a78dd22e4b98e043516dcfd444a3187ba2" gracePeriod=5
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.346383 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.409653 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.518165 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.528609 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.617595 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.643490 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.719741 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.882349 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.910469 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.948168 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 30 05:10:59 crc kubenswrapper[4841]: I0130 05:10:59.993314 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.309276 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.546935 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.595002 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.616570 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.651563 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.722113 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.752531 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.776797 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.802544 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.867051 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.876269 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.906687 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 05:11:00 crc kubenswrapper[4841]: I0130 05:11:00.979010 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.163947 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.352469 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.364014 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.427351 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.508389 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.511170 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.602290 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.872583 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"]
Jan 30 05:11:01 crc kubenswrapper[4841]: I0130 05:11:01.923213 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.211152 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.234706 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 30 05:11:02 crc kubenswrapper[4841]: E0130 05:11:02.251554 4841 log.go:32] "RunPodSandbox from runtime service failed" err=<
Jan 30 05:11:02 crc kubenswrapper[4841]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718" Netns:"/var/run/netns/98f62980-5793-4b14-903c-86d7bc4461b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod "oauth-openshift-85f4f78dc8-d2x46" not found
Jan 30 05:11:02 crc kubenswrapper[4841]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Jan 30 05:11:02 crc kubenswrapper[4841]: >
Jan 30 05:11:02 crc kubenswrapper[4841]: E0130 05:11:02.251695 4841 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Jan 30 05:11:02 crc kubenswrapper[4841]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718" Netns:"/var/run/netns/98f62980-5793-4b14-903c-86d7bc4461b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod "oauth-openshift-85f4f78dc8-d2x46" not found
Jan 30 05:11:02 crc kubenswrapper[4841]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Jan 30 05:11:02 crc kubenswrapper[4841]: > pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:11:02 crc kubenswrapper[4841]: E0130 05:11:02.251730 4841 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Jan 30 05:11:02 crc kubenswrapper[4841]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718" Netns:"/var/run/netns/98f62980-5793-4b14-903c-86d7bc4461b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod "oauth-openshift-85f4f78dc8-d2x46" not found
Jan 30 05:11:02 crc kubenswrapper[4841]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Jan 30 05:11:02 crc kubenswrapper[4841]: > pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:11:02 crc kubenswrapper[4841]: E0130 05:11:02.251829 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-85f4f78dc8-d2x46_openshift-authentication(bf3bea98-f750-48e9-82e7-c5905ec78162)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-85f4f78dc8-d2x46_openshift-authentication(bf3bea98-f750-48e9-82e7-c5905ec78162)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718\\\" Netns:\\\"/var/run/netns/98f62980-5793-4b14-903c-86d7bc4461b5\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=fcc556bec607c04c5459a0b3067f4b527844927bf4b4a918f44f48f94a3e7718;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod \\\"oauth-openshift-85f4f78dc8-d2x46\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" podUID="bf3bea98-f750-48e9-82e7-c5905ec78162"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.278331 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.339980 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.340777 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.465452 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.734534 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 30 05:11:02 crc kubenswrapper[4841]: I0130 05:11:02.836742 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 30 05:11:03 crc kubenswrapper[4841]: I0130 05:11:03.296728 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 30 05:11:03 crc kubenswrapper[4841]: I0130 05:11:03.572171 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 30 05:11:03 crc kubenswrapper[4841]: I0130 05:11:03.920632 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.358633 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.358719 4841 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d9331a16299a53b6af16bb9da4cd45a78dd22e4b98e043516dcfd444a3187ba2" exitCode=137
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.725348 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.725493 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895385 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895564 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895703 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895828 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895846 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895872 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895953 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.895943 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.896024 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.896547 4841 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.896575 4841 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.896597 4841 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.896615 4841 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.907522 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:11:04 crc kubenswrapper[4841]: I0130 05:11:04.997651 4841 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 05:11:05 crc kubenswrapper[4841]: E0130 05:11:05.322631 4841 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 30 05:11:05 crc kubenswrapper[4841]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6" Netns:"/var/run/netns/b7e591a5-6e7a-4143-9735-668860533422" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod "oauth-openshift-85f4f78dc8-d2x46" not found Jan 30 05:11:05 crc kubenswrapper[4841]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:11:05 crc kubenswrapper[4841]: > Jan 30 05:11:05 crc kubenswrapper[4841]: E0130 05:11:05.322702 4841 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 30 05:11:05 crc kubenswrapper[4841]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6" Netns:"/var/run/netns/b7e591a5-6e7a-4143-9735-668860533422" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod "oauth-openshift-85f4f78dc8-d2x46" not found Jan 30 05:11:05 crc kubenswrapper[4841]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:11:05 crc kubenswrapper[4841]: > pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:11:05 crc kubenswrapper[4841]: E0130 05:11:05.322728 4841 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Jan 30 05:11:05 crc kubenswrapper[4841]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6" Netns:"/var/run/netns/b7e591a5-6e7a-4143-9735-668860533422" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod "oauth-openshift-85f4f78dc8-d2x46" not found Jan 30 05:11:05 crc 
kubenswrapper[4841]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 05:11:05 crc kubenswrapper[4841]: > pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:11:05 crc kubenswrapper[4841]: E0130 05:11:05.322788 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-85f4f78dc8-d2x46_openshift-authentication(bf3bea98-f750-48e9-82e7-c5905ec78162)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-85f4f78dc8-d2x46_openshift-authentication(bf3bea98-f750-48e9-82e7-c5905ec78162)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-85f4f78dc8-d2x46_openshift-authentication_bf3bea98-f750-48e9-82e7-c5905ec78162_0(745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6): error adding pod openshift-authentication_oauth-openshift-85f4f78dc8-d2x46 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6\\\" Netns:\\\"/var/run/netns/b7e591a5-6e7a-4143-9735-668860533422\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-85f4f78dc8-d2x46;K8S_POD_INFRA_CONTAINER_ID=745dc1b195f9b7a7a141e78379f104b1116f9e49fefc5b3494420ea1dbc6a8b6;K8S_POD_UID=bf3bea98-f750-48e9-82e7-c5905ec78162\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-85f4f78dc8-d2x46] networking: Multus: 
[openshift-authentication/oauth-openshift-85f4f78dc8-d2x46/bf3bea98-f750-48e9-82e7-c5905ec78162]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-85f4f78dc8-d2x46 in out of cluster comm: pod \\\"oauth-openshift-85f4f78dc8-d2x46\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" podUID="bf3bea98-f750-48e9-82e7-c5905ec78162" Jan 30 05:11:05 crc kubenswrapper[4841]: I0130 05:11:05.368664 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 05:11:05 crc kubenswrapper[4841]: I0130 05:11:05.368730 4841 scope.go:117] "RemoveContainer" containerID="d9331a16299a53b6af16bb9da4cd45a78dd22e4b98e043516dcfd444a3187ba2" Jan 30 05:11:05 crc kubenswrapper[4841]: I0130 05:11:05.368841 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 05:11:06 crc kubenswrapper[4841]: I0130 05:11:06.442602 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 30 05:11:06 crc kubenswrapper[4841]: I0130 05:11:06.443266 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 30 05:11:06 crc kubenswrapper[4841]: I0130 05:11:06.458950 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 05:11:06 crc kubenswrapper[4841]: I0130 05:11:06.459029 4841 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ea4fdca7-d34c-4e63-833a-86c118437cb3" Jan 30 05:11:06 crc kubenswrapper[4841]: I0130 05:11:06.465514 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 05:11:06 crc kubenswrapper[4841]: I0130 05:11:06.465567 4841 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ea4fdca7-d34c-4e63-833a-86c118437cb3" Jan 30 05:11:09 crc kubenswrapper[4841]: I0130 05:11:09.727991 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 30 05:11:10 crc kubenswrapper[4841]: I0130 05:11:10.463768 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:11:10 crc kubenswrapper[4841]: I0130 
05:11:10.463849 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:11:10 crc kubenswrapper[4841]: I0130 05:11:10.492922 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 05:11:10 crc kubenswrapper[4841]: I0130 05:11:10.952977 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 05:11:13 crc kubenswrapper[4841]: I0130 05:11:13.421896 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 05:11:16 crc kubenswrapper[4841]: I0130 05:11:16.452821 4841 generic.go:334] "Generic (PLEG): container finished" podID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerID="852c3759a609a71af3531896af0f8229e0870854040facca8847ea946c31339e" exitCode=0 Jan 30 05:11:16 crc kubenswrapper[4841]: I0130 05:11:16.452919 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" event={"ID":"7599df20-a4f3-4f48-b413-8eec3c0bba38","Type":"ContainerDied","Data":"852c3759a609a71af3531896af0f8229e0870854040facca8847ea946c31339e"} Jan 30 05:11:16 crc kubenswrapper[4841]: I0130 05:11:16.454199 4841 scope.go:117] "RemoveContainer" containerID="852c3759a609a71af3531896af0f8229e0870854040facca8847ea946c31339e" Jan 30 05:11:16 crc kubenswrapper[4841]: I0130 05:11:16.574792 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 05:11:16 crc kubenswrapper[4841]: I0130 05:11:16.983113 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Jan 30 05:11:17 crc kubenswrapper[4841]: I0130 05:11:17.406986 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 05:11:17 crc kubenswrapper[4841]: I0130 05:11:17.465789 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" event={"ID":"7599df20-a4f3-4f48-b413-8eec3c0bba38","Type":"ContainerStarted","Data":"b4c5ec1c563b6c96488f2960c37456f3c0275c8eecb5d48c7068cea77991ebf3"} Jan 30 05:11:17 crc kubenswrapper[4841]: I0130 05:11:17.466674 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:11:17 crc kubenswrapper[4841]: I0130 05:11:17.470306 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:11:17 crc kubenswrapper[4841]: I0130 05:11:17.488989 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 30 05:11:17 crc kubenswrapper[4841]: I0130 05:11:17.708242 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 30 05:11:18 crc kubenswrapper[4841]: I0130 05:11:18.126223 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 05:11:19 crc kubenswrapper[4841]: I0130 05:11:19.776885 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 05:11:20 crc kubenswrapper[4841]: I0130 05:11:20.043393 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 05:11:20 crc kubenswrapper[4841]: I0130 05:11:20.431865 4841 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:11:20 crc kubenswrapper[4841]: I0130 05:11:20.432440 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:11:20 crc kubenswrapper[4841]: I0130 05:11:20.696758 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85f4f78dc8-d2x46"] Jan 30 05:11:20 crc kubenswrapper[4841]: I0130 05:11:20.787112 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.160595 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.264802 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.493789 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" event={"ID":"bf3bea98-f750-48e9-82e7-c5905ec78162","Type":"ContainerStarted","Data":"2c720b717f95773001ca2324948b0f35e25b34e9778e5b0626d84f90f367c2b0"} Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.493876 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" event={"ID":"bf3bea98-f750-48e9-82e7-c5905ec78162","Type":"ContainerStarted","Data":"79b55055a81dbf26935a3bfcbba2bb8ea450ea85d3f8f21f499b2313211fa35d"} Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.494209 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.525945 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" podStartSLOduration=75.525918772 podStartE2EDuration="1m15.525918772s" podCreationTimestamp="2026-01-30 05:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:11:21.522925218 +0000 UTC m=+218.516397946" watchObservedRunningTime="2026-01-30 05:11:21.525918772 +0000 UTC m=+218.519391450" Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.597516 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 05:11:21 crc kubenswrapper[4841]: I0130 05:11:21.753925 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-85f4f78dc8-d2x46" Jan 30 05:11:22 crc kubenswrapper[4841]: I0130 05:11:22.684349 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 05:11:23 crc kubenswrapper[4841]: I0130 05:11:23.208339 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 05:11:23 crc kubenswrapper[4841]: I0130 05:11:23.258106 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 05:11:23 crc kubenswrapper[4841]: I0130 05:11:23.396164 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 05:11:23 crc kubenswrapper[4841]: I0130 05:11:23.484141 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 30 05:11:23 crc kubenswrapper[4841]: I0130 05:11:23.566485 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 05:11:24 crc kubenswrapper[4841]: I0130 05:11:24.159337 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 30 05:11:24 crc kubenswrapper[4841]: I0130 05:11:24.533321 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 05:11:24 crc kubenswrapper[4841]: I0130 05:11:24.872348 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 05:11:25 crc kubenswrapper[4841]: I0130 05:11:25.576023 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 30 05:11:25 crc kubenswrapper[4841]: I0130 05:11:25.848570 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 30 05:11:26 crc kubenswrapper[4841]: I0130 05:11:26.233003 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 05:11:26 crc kubenswrapper[4841]: I0130 05:11:26.914806 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 05:11:27 crc kubenswrapper[4841]: I0130 05:11:27.322012 4841 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 05:11:27 crc kubenswrapper[4841]: I0130 05:11:27.367095 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.088489 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9wm9f"] Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.089119 4841 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" podUID="a785f654-41ed-4c03-baf7-b0fb5bc3f543" containerName="controller-manager" containerID="cri-o://b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b" gracePeriod=30 Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.184142 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"] Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.184791 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" podUID="343813f5-7868-4aa9-9d23-6c3f70f6bbd8" containerName="route-controller-manager" containerID="cri-o://79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2" gracePeriod=30 Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.499358 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.510104 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.561990 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.562161 4841 generic.go:334] "Generic (PLEG): container finished" podID="343813f5-7868-4aa9-9d23-6c3f70f6bbd8" containerID="79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2" exitCode=0 Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.562187 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" event={"ID":"343813f5-7868-4aa9-9d23-6c3f70f6bbd8","Type":"ContainerDied","Data":"79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2"} Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.562467 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq" event={"ID":"343813f5-7868-4aa9-9d23-6c3f70f6bbd8","Type":"ContainerDied","Data":"c1ef02c93d006124da44ebaf4d95d614d15817b21dbfd7f04d2747a7b64f29de"} Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.562529 4841 scope.go:117] "RemoveContainer" containerID="79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2" Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.564806 4841 generic.go:334] "Generic (PLEG): container finished" podID="a785f654-41ed-4c03-baf7-b0fb5bc3f543" containerID="b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b" exitCode=0 Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.564838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" event={"ID":"a785f654-41ed-4c03-baf7-b0fb5bc3f543","Type":"ContainerDied","Data":"b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b"} Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.564858 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f" event={"ID":"a785f654-41ed-4c03-baf7-b0fb5bc3f543","Type":"ContainerDied","Data":"0d4b94244b9cee6d38105026731cb2e3c2579462d484dc74cc9f5930d17c3064"}
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.564894 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9wm9f"
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.593003 4841 scope.go:117] "RemoveContainer" containerID="79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2"
Jan 30 05:11:29 crc kubenswrapper[4841]: E0130 05:11:29.593626 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2\": container with ID starting with 79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2 not found: ID does not exist" containerID="79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2"
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.593660 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2"} err="failed to get container status \"79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2\": rpc error: code = NotFound desc = could not find container \"79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2\": container with ID starting with 79e262a0edeaaea1b2e2f7737c3fccd6b687c6265e8ec1c9ee31055e1ba01dc2 not found: ID does not exist"
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.593684 4841 scope.go:117] "RemoveContainer" containerID="b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b"
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.609643 4841 scope.go:117] "RemoveContainer" containerID="b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b"
Jan 30 05:11:29 crc kubenswrapper[4841]: E0130 05:11:29.610376 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b\": container with ID starting with b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b not found: ID does not exist" containerID="b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b"
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.610459 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b"} err="failed to get container status \"b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b\": rpc error: code = NotFound desc = could not find container \"b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b\": container with ID starting with b9d4e40610cfcfcd92977d1c4df933f580d4f0a34e74440096ba402c2bc3448b not found: ID does not exist"
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.650070 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-config\") pod \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.650145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-proxy-ca-bundles\") pod \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.650226 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a785f654-41ed-4c03-baf7-b0fb5bc3f543-serving-cert\") pod \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.650290 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-client-ca\") pod \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.650311 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn6xp\" (UniqueName: \"kubernetes.io/projected/a785f654-41ed-4c03-baf7-b0fb5bc3f543-kube-api-access-nn6xp\") pod \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\" (UID: \"a785f654-41ed-4c03-baf7-b0fb5bc3f543\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.650986 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-client-ca" (OuterVolumeSpecName: "client-ca") pod "a785f654-41ed-4c03-baf7-b0fb5bc3f543" (UID: "a785f654-41ed-4c03-baf7-b0fb5bc3f543"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.651059 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a785f654-41ed-4c03-baf7-b0fb5bc3f543" (UID: "a785f654-41ed-4c03-baf7-b0fb5bc3f543"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.651108 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-config" (OuterVolumeSpecName: "config") pod "a785f654-41ed-4c03-baf7-b0fb5bc3f543" (UID: "a785f654-41ed-4c03-baf7-b0fb5bc3f543"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.651247 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.651259 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.651267 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a785f654-41ed-4c03-baf7-b0fb5bc3f543-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.656742 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a785f654-41ed-4c03-baf7-b0fb5bc3f543-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a785f654-41ed-4c03-baf7-b0fb5bc3f543" (UID: "a785f654-41ed-4c03-baf7-b0fb5bc3f543"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.656832 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a785f654-41ed-4c03-baf7-b0fb5bc3f543-kube-api-access-nn6xp" (OuterVolumeSpecName: "kube-api-access-nn6xp") pod "a785f654-41ed-4c03-baf7-b0fb5bc3f543" (UID: "a785f654-41ed-4c03-baf7-b0fb5bc3f543"). InnerVolumeSpecName "kube-api-access-nn6xp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.751790 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-serving-cert\") pod \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.751858 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-client-ca\") pod \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.751917 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-config\") pod \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.751946 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn5cp\" (UniqueName: \"kubernetes.io/projected/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-kube-api-access-wn5cp\") pod \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\" (UID: \"343813f5-7868-4aa9-9d23-6c3f70f6bbd8\") "
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.752198 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a785f654-41ed-4c03-baf7-b0fb5bc3f543-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.752215 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn6xp\" (UniqueName: \"kubernetes.io/projected/a785f654-41ed-4c03-baf7-b0fb5bc3f543-kube-api-access-nn6xp\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.753206 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-client-ca" (OuterVolumeSpecName: "client-ca") pod "343813f5-7868-4aa9-9d23-6c3f70f6bbd8" (UID: "343813f5-7868-4aa9-9d23-6c3f70f6bbd8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.753278 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-config" (OuterVolumeSpecName: "config") pod "343813f5-7868-4aa9-9d23-6c3f70f6bbd8" (UID: "343813f5-7868-4aa9-9d23-6c3f70f6bbd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.756949 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-kube-api-access-wn5cp" (OuterVolumeSpecName: "kube-api-access-wn5cp") pod "343813f5-7868-4aa9-9d23-6c3f70f6bbd8" (UID: "343813f5-7868-4aa9-9d23-6c3f70f6bbd8"). InnerVolumeSpecName "kube-api-access-wn5cp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.756991 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "343813f5-7868-4aa9-9d23-6c3f70f6bbd8" (UID: "343813f5-7868-4aa9-9d23-6c3f70f6bbd8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.841609 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.854295 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.854365 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.854384 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.854451 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn5cp\" (UniqueName: \"kubernetes.io/projected/343813f5-7868-4aa9-9d23-6c3f70f6bbd8-kube-api-access-wn5cp\") on node \"crc\" DevicePath \"\""
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.911808 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9wm9f"]
Jan 30 05:11:29 crc kubenswrapper[4841]: I0130 05:11:29.916770 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9wm9f"]
Jan 30 05:11:30 crc kubenswrapper[4841]: I0130 05:11:30.096806 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 30 05:11:30 crc kubenswrapper[4841]: I0130 05:11:30.352819 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 30 05:11:30 crc kubenswrapper[4841]: I0130 05:11:30.444022 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a785f654-41ed-4c03-baf7-b0fb5bc3f543" path="/var/lib/kubelet/pods/a785f654-41ed-4c03-baf7-b0fb5bc3f543/volumes"
Jan 30 05:11:30 crc kubenswrapper[4841]: I0130 05:11:30.574006 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"
Jan 30 05:11:30 crc kubenswrapper[4841]: I0130 05:11:30.597000 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"]
Jan 30 05:11:30 crc kubenswrapper[4841]: I0130 05:11:30.604515 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-52nnq"]
Jan 30 05:11:30 crc kubenswrapper[4841]: I0130 05:11:30.729260 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.013058 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.189503 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"]
Jan 30 05:11:31 crc kubenswrapper[4841]: E0130 05:11:31.189793 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a785f654-41ed-4c03-baf7-b0fb5bc3f543" containerName="controller-manager"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.189814 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a785f654-41ed-4c03-baf7-b0fb5bc3f543" containerName="controller-manager"
Jan 30 05:11:31 crc kubenswrapper[4841]: E0130 05:11:31.189844 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.189854 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 05:11:31 crc kubenswrapper[4841]: E0130 05:11:31.189869 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343813f5-7868-4aa9-9d23-6c3f70f6bbd8" containerName="route-controller-manager"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.189877 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="343813f5-7868-4aa9-9d23-6c3f70f6bbd8" containerName="route-controller-manager"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.189989 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="343813f5-7868-4aa9-9d23-6c3f70f6bbd8" containerName="route-controller-manager"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.190003 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a785f654-41ed-4c03-baf7-b0fb5bc3f543" containerName="controller-manager"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.190017 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.190537 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.194561 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.195254 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.195498 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.195715 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.195912 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.196285 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.201089 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"]
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.202230 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.207902 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"]
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.208884 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.208934 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.211758 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"]
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.226865 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.227452 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.229253 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.229271 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.231979 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.372619 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6p6\" (UniqueName: \"kubernetes.io/projected/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-kube-api-access-sb6p6\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.372985 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-client-ca\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.373007 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwt9p\" (UniqueName: \"kubernetes.io/projected/7c0d3e02-522d-4cb7-848f-51b35f02388f-kube-api-access-fwt9p\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.373031 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-serving-cert\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.373093 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-config\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.373121 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0d3e02-522d-4cb7-848f-51b35f02388f-serving-cert\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.373149 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-proxy-ca-bundles\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.373179 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-config\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.373203 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-client-ca\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.474714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0d3e02-522d-4cb7-848f-51b35f02388f-serving-cert\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.474780 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-proxy-ca-bundles\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.474829 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-config\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.474905 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-client-ca\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.474951 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6p6\" (UniqueName: \"kubernetes.io/projected/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-kube-api-access-sb6p6\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.474990 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-client-ca\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.475022 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwt9p\" (UniqueName: \"kubernetes.io/projected/7c0d3e02-522d-4cb7-848f-51b35f02388f-kube-api-access-fwt9p\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.475055 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-serving-cert\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.475136 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-config\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.476318 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-client-ca\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.477363 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-config\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.477384 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-proxy-ca-bundles\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.477916 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-client-ca\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.478766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-config\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.483538 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-serving-cert\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.487218 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0d3e02-522d-4cb7-848f-51b35f02388f-serving-cert\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.494690 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6p6\" (UniqueName: \"kubernetes.io/projected/e0e8ae97-9364-4396-a9f5-e7bc2acc2147-kube-api-access-sb6p6\") pod \"controller-manager-6dcd857c4d-kfjhz\" (UID: \"e0e8ae97-9364-4396-a9f5-e7bc2acc2147\") " pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.500544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwt9p\" (UniqueName: \"kubernetes.io/projected/7c0d3e02-522d-4cb7-848f-51b35f02388f-kube-api-access-fwt9p\") pod \"route-controller-manager-66d4fcb69d-99hc8\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.534249 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.550559 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.769860 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"]
Jan 30 05:11:31 crc kubenswrapper[4841]: I0130 05:11:31.936102 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"]
Jan 30 05:11:31 crc kubenswrapper[4841]: W0130 05:11:31.942505 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0d3e02_522d_4cb7_848f_51b35f02388f.slice/crio-6bf82f3f6d5d75a0846990b07ec7d505d91be6d16b5a8482eacbeb343cdabb06 WatchSource:0}: Error finding container 6bf82f3f6d5d75a0846990b07ec7d505d91be6d16b5a8482eacbeb343cdabb06: Status 404 returned error can't find the container with id 6bf82f3f6d5d75a0846990b07ec7d505d91be6d16b5a8482eacbeb343cdabb06
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.075289 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.331419 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.437670 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343813f5-7868-4aa9-9d23-6c3f70f6bbd8" path="/var/lib/kubelet/pods/343813f5-7868-4aa9-9d23-6c3f70f6bbd8/volumes"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.587961 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz" event={"ID":"e0e8ae97-9364-4396-a9f5-e7bc2acc2147","Type":"ContainerStarted","Data":"52522c151b385f79a1036bbd42c4d6239bcf1b7512f72dc4fb0c14b182692cbb"}
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.588013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz" event={"ID":"e0e8ae97-9364-4396-a9f5-e7bc2acc2147","Type":"ContainerStarted","Data":"cb3bbf602b582b96e5a60bdaaa456c3a0723c7cfdaadbc84135e5dbd2587c6ce"}
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.588188 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.589594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" event={"ID":"7c0d3e02-522d-4cb7-848f-51b35f02388f","Type":"ContainerStarted","Data":"56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5"}
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.589633 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" event={"ID":"7c0d3e02-522d-4cb7-848f-51b35f02388f","Type":"ContainerStarted","Data":"6bf82f3f6d5d75a0846990b07ec7d505d91be6d16b5a8482eacbeb343cdabb06"}
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.589743 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.592142 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.610500 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz" podStartSLOduration=3.610485497 podStartE2EDuration="3.610485497s" podCreationTimestamp="2026-01-30 05:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:11:32.607735519 +0000 UTC m=+229.601208157" watchObservedRunningTime="2026-01-30 05:11:32.610485497 +0000 UTC m=+229.603958135"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.619296 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.641482 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" podStartSLOduration=3.641463548 podStartE2EDuration="3.641463548s" podCreationTimestamp="2026-01-30 05:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:11:32.622259735 +0000 UTC m=+229.615732373" watchObservedRunningTime="2026-01-30 05:11:32.641463548 +0000 UTC m=+229.634936186"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.686719 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"
Jan 30 05:11:32 crc kubenswrapper[4841]: I0130 05:11:32.847605 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 30 05:11:33 crc kubenswrapper[4841]: I0130 05:11:33.149529 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 30 05:11:34 crc kubenswrapper[4841]: I0130 05:11:34.616169 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 30 05:11:34 crc kubenswrapper[4841]: I0130 05:11:34.983625 4841 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 30 05:11:36 crc kubenswrapper[4841]: I0130 05:11:36.955726 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 30 05:11:38 crc kubenswrapper[4841]: I0130 05:11:38.967504 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 30 05:11:39 crc kubenswrapper[4841]: I0130 05:11:39.134439 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 30 05:11:39 crc kubenswrapper[4841]: I0130 05:11:39.149870 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 05:11:39 crc kubenswrapper[4841]: I0130 05:11:39.508702 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 30 05:11:39 crc kubenswrapper[4841]: I0130 05:11:39.907610 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.464323 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.464481 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.464555 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.465459 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.465571 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801" gracePeriod=600 Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.633115 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801" exitCode=0 Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.633175 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801"} Jan 30 05:11:40 crc kubenswrapper[4841]: I0130 05:11:40.638998 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 05:11:41 crc kubenswrapper[4841]: I0130 05:11:41.174585 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 05:11:41 crc kubenswrapper[4841]: I0130 05:11:41.643301 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"abf7e2b0da98eaaaac474db21786882f5f51c317ca2c9bd69e78825d71977aaa"} Jan 30 05:11:42 crc kubenswrapper[4841]: I0130 05:11:42.044739 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 05:11:42 crc kubenswrapper[4841]: I0130 05:11:42.734799 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 05:11:44 crc kubenswrapper[4841]: I0130 05:11:44.584331 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 05:11:44 crc kubenswrapper[4841]: I0130 05:11:44.642768 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 30 05:11:47 crc kubenswrapper[4841]: I0130 05:11:47.310233 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.805177 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7mnc5"] Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.806904 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.828186 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7mnc5"] Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915136 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-bound-sa-token\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wtcv\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-kube-api-access-8wtcv\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915217 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-registry-tls\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915246 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4dcd8a3-4eb3-4679-9227-2971d145bf02-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915299 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4dcd8a3-4eb3-4679-9227-2971d145bf02-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915325 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4dcd8a3-4eb3-4679-9227-2971d145bf02-registry-certificates\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915349 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.915390 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4dcd8a3-4eb3-4679-9227-2971d145bf02-trusted-ca\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:08 crc kubenswrapper[4841]: I0130 05:12:08.941976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.017078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4dcd8a3-4eb3-4679-9227-2971d145bf02-trusted-ca\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.017544 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-bound-sa-token\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.017597 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wtcv\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-kube-api-access-8wtcv\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.017644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-registry-tls\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.017690 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4dcd8a3-4eb3-4679-9227-2971d145bf02-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.017739 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4dcd8a3-4eb3-4679-9227-2971d145bf02-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.017787 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4dcd8a3-4eb3-4679-9227-2971d145bf02-registry-certificates\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.018703 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4dcd8a3-4eb3-4679-9227-2971d145bf02-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.018827 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4dcd8a3-4eb3-4679-9227-2971d145bf02-trusted-ca\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 
crc kubenswrapper[4841]: I0130 05:12:09.019475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4dcd8a3-4eb3-4679-9227-2971d145bf02-registry-certificates\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.026625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-registry-tls\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.032050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4dcd8a3-4eb3-4679-9227-2971d145bf02-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.037529 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-bound-sa-token\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.050905 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wtcv\" (UniqueName: \"kubernetes.io/projected/b4dcd8a3-4eb3-4679-9227-2971d145bf02-kube-api-access-8wtcv\") pod \"image-registry-66df7c8f76-7mnc5\" (UID: \"b4dcd8a3-4eb3-4679-9227-2971d145bf02\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.126876 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.619281 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7mnc5"] Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.818891 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" event={"ID":"b4dcd8a3-4eb3-4679-9227-2971d145bf02","Type":"ContainerStarted","Data":"077b62460b86e5b4301c43e20f262867d547cacde636bc22e1ea634fb118dd0e"} Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.818970 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" event={"ID":"b4dcd8a3-4eb3-4679-9227-2971d145bf02","Type":"ContainerStarted","Data":"b35df248dfe5f8f4bb7824587f6f17414e58f873cb6340e62356b72f70a53e05"} Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.819087 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:09 crc kubenswrapper[4841]: I0130 05:12:09.836265 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" podStartSLOduration=1.836241316 podStartE2EDuration="1.836241316s" podCreationTimestamp="2026-01-30 05:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:09.835903638 +0000 UTC m=+266.829376316" watchObservedRunningTime="2026-01-30 05:12:09.836241316 +0000 UTC m=+266.829713994" Jan 30 05:12:20 crc kubenswrapper[4841]: I0130 05:12:20.628456 4841 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-nhvlf"] Jan 30 05:12:20 crc kubenswrapper[4841]: I0130 05:12:20.629896 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nhvlf" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerName="registry-server" containerID="cri-o://954b3dae9e656fbd6e938c990df2ecf605a709518726fa0c5a5e715eff693039" gracePeriod=2 Jan 30 05:12:20 crc kubenswrapper[4841]: I0130 05:12:20.891042 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerID="954b3dae9e656fbd6e938c990df2ecf605a709518726fa0c5a5e715eff693039" exitCode=0 Jan 30 05:12:20 crc kubenswrapper[4841]: I0130 05:12:20.891102 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhvlf" event={"ID":"f7b9e216-aaea-4222-ab65-efadd17f2f46","Type":"ContainerDied","Data":"954b3dae9e656fbd6e938c990df2ecf605a709518726fa0c5a5e715eff693039"} Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.126395 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.214089 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg2tp\" (UniqueName: \"kubernetes.io/projected/f7b9e216-aaea-4222-ab65-efadd17f2f46-kube-api-access-wg2tp\") pod \"f7b9e216-aaea-4222-ab65-efadd17f2f46\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.214247 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-catalog-content\") pod \"f7b9e216-aaea-4222-ab65-efadd17f2f46\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.214378 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-utilities\") pod \"f7b9e216-aaea-4222-ab65-efadd17f2f46\" (UID: \"f7b9e216-aaea-4222-ab65-efadd17f2f46\") " Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.215691 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-utilities" (OuterVolumeSpecName: "utilities") pod "f7b9e216-aaea-4222-ab65-efadd17f2f46" (UID: "f7b9e216-aaea-4222-ab65-efadd17f2f46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.225336 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b9e216-aaea-4222-ab65-efadd17f2f46-kube-api-access-wg2tp" (OuterVolumeSpecName: "kube-api-access-wg2tp") pod "f7b9e216-aaea-4222-ab65-efadd17f2f46" (UID: "f7b9e216-aaea-4222-ab65-efadd17f2f46"). InnerVolumeSpecName "kube-api-access-wg2tp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.290042 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7b9e216-aaea-4222-ab65-efadd17f2f46" (UID: "f7b9e216-aaea-4222-ab65-efadd17f2f46"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.316720 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.316775 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b9e216-aaea-4222-ab65-efadd17f2f46-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.316796 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg2tp\" (UniqueName: \"kubernetes.io/projected/f7b9e216-aaea-4222-ab65-efadd17f2f46-kube-api-access-wg2tp\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.901893 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhvlf" event={"ID":"f7b9e216-aaea-4222-ab65-efadd17f2f46","Type":"ContainerDied","Data":"d1a377937c8cfb61ed291dbb846284c315e9ce2495e9cfecf096c993fdc27ec3"} Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.901963 4841 scope.go:117] "RemoveContainer" containerID="954b3dae9e656fbd6e938c990df2ecf605a709518726fa0c5a5e715eff693039" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.901999 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhvlf" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.926728 4841 scope.go:117] "RemoveContainer" containerID="7a9237339dded0cc16f47c9935d3a3286d5ceac3b7e00245f27c81ba1202991e" Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.955716 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nhvlf"] Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.963247 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nhvlf"] Jan 30 05:12:21 crc kubenswrapper[4841]: I0130 05:12:21.965955 4841 scope.go:117] "RemoveContainer" containerID="e8c0e06eedb659d46fec811ebbbe96bd2efa57cdd4fa3c04b26083b550a1ad44" Jan 30 05:12:22 crc kubenswrapper[4841]: I0130 05:12:22.447368 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" path="/var/lib/kubelet/pods/f7b9e216-aaea-4222-ab65-efadd17f2f46/volumes" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.083291 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"] Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.084158 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" podUID="7c0d3e02-522d-4cb7-848f-51b35f02388f" containerName="route-controller-manager" containerID="cri-o://56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5" gracePeriod=30 Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.134272 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7mnc5" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.213623 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-2f7jj"] Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.445864 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.536723 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-client-ca\") pod \"7c0d3e02-522d-4cb7-848f-51b35f02388f\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.536814 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-config\") pod \"7c0d3e02-522d-4cb7-848f-51b35f02388f\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.536876 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwt9p\" (UniqueName: \"kubernetes.io/projected/7c0d3e02-522d-4cb7-848f-51b35f02388f-kube-api-access-fwt9p\") pod \"7c0d3e02-522d-4cb7-848f-51b35f02388f\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.536906 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0d3e02-522d-4cb7-848f-51b35f02388f-serving-cert\") pod \"7c0d3e02-522d-4cb7-848f-51b35f02388f\" (UID: \"7c0d3e02-522d-4cb7-848f-51b35f02388f\") " Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.537690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-client-ca" (OuterVolumeSpecName: "client-ca") pod "7c0d3e02-522d-4cb7-848f-51b35f02388f" (UID: 
"7c0d3e02-522d-4cb7-848f-51b35f02388f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.538150 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-config" (OuterVolumeSpecName: "config") pod "7c0d3e02-522d-4cb7-848f-51b35f02388f" (UID: "7c0d3e02-522d-4cb7-848f-51b35f02388f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.541650 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0d3e02-522d-4cb7-848f-51b35f02388f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7c0d3e02-522d-4cb7-848f-51b35f02388f" (UID: "7c0d3e02-522d-4cb7-848f-51b35f02388f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.547521 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0d3e02-522d-4cb7-848f-51b35f02388f-kube-api-access-fwt9p" (OuterVolumeSpecName: "kube-api-access-fwt9p") pod "7c0d3e02-522d-4cb7-848f-51b35f02388f" (UID: "7c0d3e02-522d-4cb7-848f-51b35f02388f"). InnerVolumeSpecName "kube-api-access-fwt9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.637886 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.637923 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwt9p\" (UniqueName: \"kubernetes.io/projected/7c0d3e02-522d-4cb7-848f-51b35f02388f-kube-api-access-fwt9p\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.637935 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0d3e02-522d-4cb7-848f-51b35f02388f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.637945 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0d3e02-522d-4cb7-848f-51b35f02388f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.959625 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c0d3e02-522d-4cb7-848f-51b35f02388f" containerID="56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5" exitCode=0 Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.959695 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" event={"ID":"7c0d3e02-522d-4cb7-848f-51b35f02388f","Type":"ContainerDied","Data":"56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5"} Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.959732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" 
event={"ID":"7c0d3e02-522d-4cb7-848f-51b35f02388f","Type":"ContainerDied","Data":"6bf82f3f6d5d75a0846990b07ec7d505d91be6d16b5a8482eacbeb343cdabb06"} Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.959759 4841 scope.go:117] "RemoveContainer" containerID="56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.959919 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.984348 4841 scope.go:117] "RemoveContainer" containerID="56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5" Jan 30 05:12:29 crc kubenswrapper[4841]: E0130 05:12:29.984918 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5\": container with ID starting with 56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5 not found: ID does not exist" containerID="56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5" Jan 30 05:12:29 crc kubenswrapper[4841]: I0130 05:12:29.985055 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5"} err="failed to get container status \"56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5\": rpc error: code = NotFound desc = could not find container \"56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5\": container with ID starting with 56d3e6565976d52c43d9f1156c33c1d5ea54cb8f2f333620ef04ffba9d6d1dc5 not found: ID does not exist" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.040072 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"] Jan 
30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.051240 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d4fcb69d-99hc8"] Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.237679 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f"] Jan 30 05:12:30 crc kubenswrapper[4841]: E0130 05:12:30.238360 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerName="registry-server" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.238382 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerName="registry-server" Jan 30 05:12:30 crc kubenswrapper[4841]: E0130 05:12:30.238455 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerName="extract-utilities" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.238475 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerName="extract-utilities" Jan 30 05:12:30 crc kubenswrapper[4841]: E0130 05:12:30.238501 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0d3e02-522d-4cb7-848f-51b35f02388f" containerName="route-controller-manager" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.238517 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0d3e02-522d-4cb7-848f-51b35f02388f" containerName="route-controller-manager" Jan 30 05:12:30 crc kubenswrapper[4841]: E0130 05:12:30.238539 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerName="extract-content" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.238557 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" 
containerName="extract-content" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.238916 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0d3e02-522d-4cb7-848f-51b35f02388f" containerName="route-controller-manager" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.239000 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b9e216-aaea-4222-ab65-efadd17f2f46" containerName="registry-server" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.240308 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.243791 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.243851 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.244063 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.244442 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.244531 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.244442 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.256274 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f"] Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.347532 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrcg2\" (UniqueName: \"kubernetes.io/projected/868601fb-3285-4bc5-8468-44c22a26bcee-kube-api-access-rrcg2\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.347644 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/868601fb-3285-4bc5-8468-44c22a26bcee-serving-cert\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.347771 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868601fb-3285-4bc5-8468-44c22a26bcee-config\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.347820 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/868601fb-3285-4bc5-8468-44c22a26bcee-client-ca\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.447915 4841 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="7c0d3e02-522d-4cb7-848f-51b35f02388f" path="/var/lib/kubelet/pods/7c0d3e02-522d-4cb7-848f-51b35f02388f/volumes" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.448952 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868601fb-3285-4bc5-8468-44c22a26bcee-config\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.449007 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/868601fb-3285-4bc5-8468-44c22a26bcee-client-ca\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.449059 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrcg2\" (UniqueName: \"kubernetes.io/projected/868601fb-3285-4bc5-8468-44c22a26bcee-kube-api-access-rrcg2\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.449182 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/868601fb-3285-4bc5-8468-44c22a26bcee-serving-cert\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.451346 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/868601fb-3285-4bc5-8468-44c22a26bcee-client-ca\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.451442 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868601fb-3285-4bc5-8468-44c22a26bcee-config\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.457499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/868601fb-3285-4bc5-8468-44c22a26bcee-serving-cert\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.471590 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrcg2\" (UniqueName: \"kubernetes.io/projected/868601fb-3285-4bc5-8468-44c22a26bcee-kube-api-access-rrcg2\") pod \"route-controller-manager-7c4c5c5554-qwn9f\" (UID: \"868601fb-3285-4bc5-8468-44c22a26bcee\") " pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:30 crc kubenswrapper[4841]: I0130 05:12:30.571020 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:31 crc kubenswrapper[4841]: I0130 05:12:31.053886 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f"] Jan 30 05:12:31 crc kubenswrapper[4841]: W0130 05:12:31.061925 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868601fb_3285_4bc5_8468_44c22a26bcee.slice/crio-6a05b089c7337bac976b3b33948a6be8ab17a7665f63c3e265d591efecd5888d WatchSource:0}: Error finding container 6a05b089c7337bac976b3b33948a6be8ab17a7665f63c3e265d591efecd5888d: Status 404 returned error can't find the container with id 6a05b089c7337bac976b3b33948a6be8ab17a7665f63c3e265d591efecd5888d Jan 30 05:12:31 crc kubenswrapper[4841]: I0130 05:12:31.974267 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" event={"ID":"868601fb-3285-4bc5-8468-44c22a26bcee","Type":"ContainerStarted","Data":"f8738f515910e322460ec7c0dff2a4ecedcb8a143b6a5559778925dcd86c83aa"} Jan 30 05:12:31 crc kubenswrapper[4841]: I0130 05:12:31.974317 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" event={"ID":"868601fb-3285-4bc5-8468-44c22a26bcee","Type":"ContainerStarted","Data":"6a05b089c7337bac976b3b33948a6be8ab17a7665f63c3e265d591efecd5888d"} Jan 30 05:12:31 crc kubenswrapper[4841]: I0130 05:12:31.974723 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 05:12:31 crc kubenswrapper[4841]: I0130 05:12:31.982665 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" Jan 30 
05:12:31 crc kubenswrapper[4841]: I0130 05:12:31.999608 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c4c5c5554-qwn9f" podStartSLOduration=2.999582138 podStartE2EDuration="2.999582138s" podCreationTimestamp="2026-01-30 05:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:31.995731328 +0000 UTC m=+288.989204006" watchObservedRunningTime="2026-01-30 05:12:31.999582138 +0000 UTC m=+288.993054806" Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.871967 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t88dq"] Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.872785 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t88dq" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="registry-server" containerID="cri-o://c3e3310da4b49162dfc532fc80d8b8d94dae98b15b44de787a1da985dfef4b5f" gracePeriod=30 Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.878358 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8lz7"] Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.878755 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g8lz7" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerName="registry-server" containerID="cri-o://53c57b5e2798421fc4a66deb59265f89b7723297b8552704b8d55553ae59063d" gracePeriod=30 Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.882609 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c6xs"] Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.882911 4841 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" containerID="cri-o://b4c5ec1c563b6c96488f2960c37456f3c0275c8eecb5d48c7068cea77991ebf3" gracePeriod=30 Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.886692 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzz8x"] Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.886916 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xzz8x" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerName="registry-server" containerID="cri-o://4ba16a3964468fbeafe5fd4b370713974204613a888d8d447b8c93eca07054f9" gracePeriod=30 Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.911508 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slkqt"] Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.912220 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.919514 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pv7l"] Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.921716 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9pv7l" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="registry-server" containerID="cri-o://70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284" gracePeriod=30 Jan 30 05:12:39 crc kubenswrapper[4841]: I0130 05:12:39.926613 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slkqt"] Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.004549 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05338f03-2aa7-4afc-80f6-44a70f084420-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.004610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05338f03-2aa7-4afc-80f6-44a70f084420-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.004668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9dms\" (UniqueName: 
\"kubernetes.io/projected/05338f03-2aa7-4afc-80f6-44a70f084420-kube-api-access-h9dms\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.022158 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerID="4ba16a3964468fbeafe5fd4b370713974204613a888d8d447b8c93eca07054f9" exitCode=0 Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.022236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzz8x" event={"ID":"6a6ce2ed-da59-4d16-8d01-022b22e746f1","Type":"ContainerDied","Data":"4ba16a3964468fbeafe5fd4b370713974204613a888d8d447b8c93eca07054f9"} Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.025453 4841 generic.go:334] "Generic (PLEG): container finished" podID="12231fcc-9527-405e-bac6-734865031f83" containerID="c3e3310da4b49162dfc532fc80d8b8d94dae98b15b44de787a1da985dfef4b5f" exitCode=0 Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.025539 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t88dq" event={"ID":"12231fcc-9527-405e-bac6-734865031f83","Type":"ContainerDied","Data":"c3e3310da4b49162dfc532fc80d8b8d94dae98b15b44de787a1da985dfef4b5f"} Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.028546 4841 generic.go:334] "Generic (PLEG): container finished" podID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerID="b4c5ec1c563b6c96488f2960c37456f3c0275c8eecb5d48c7068cea77991ebf3" exitCode=0 Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.028618 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" event={"ID":"7599df20-a4f3-4f48-b413-8eec3c0bba38","Type":"ContainerDied","Data":"b4c5ec1c563b6c96488f2960c37456f3c0275c8eecb5d48c7068cea77991ebf3"} Jan 30 
05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.028652 4841 scope.go:117] "RemoveContainer" containerID="852c3759a609a71af3531896af0f8229e0870854040facca8847ea946c31339e" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.035170 4841 generic.go:334] "Generic (PLEG): container finished" podID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerID="53c57b5e2798421fc4a66deb59265f89b7723297b8552704b8d55553ae59063d" exitCode=0 Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.035210 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8lz7" event={"ID":"4a117bdc-99af-4fd8-a810-b2d08e174f77","Type":"ContainerDied","Data":"53c57b5e2798421fc4a66deb59265f89b7723297b8552704b8d55553ae59063d"} Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.105849 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05338f03-2aa7-4afc-80f6-44a70f084420-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.106198 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9dms\" (UniqueName: \"kubernetes.io/projected/05338f03-2aa7-4afc-80f6-44a70f084420-kube-api-access-h9dms\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.106256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05338f03-2aa7-4afc-80f6-44a70f084420-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.107501 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05338f03-2aa7-4afc-80f6-44a70f084420-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.111823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05338f03-2aa7-4afc-80f6-44a70f084420-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.126925 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9dms\" (UniqueName: \"kubernetes.io/projected/05338f03-2aa7-4afc-80f6-44a70f084420-kube-api-access-h9dms\") pod \"marketplace-operator-79b997595-slkqt\" (UID: \"05338f03-2aa7-4afc-80f6-44a70f084420\") " pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.266835 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.280946 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.368459 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.379589 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.393107 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.400955 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.410170 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-utilities\") pod \"12231fcc-9527-405e-bac6-734865031f83\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.410427 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-catalog-content\") pod \"12231fcc-9527-405e-bac6-734865031f83\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.410548 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st6r9\" (UniqueName: \"kubernetes.io/projected/12231fcc-9527-405e-bac6-734865031f83-kube-api-access-st6r9\") pod \"12231fcc-9527-405e-bac6-734865031f83\" (UID: \"12231fcc-9527-405e-bac6-734865031f83\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.412041 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-utilities" (OuterVolumeSpecName: "utilities") 
pod "12231fcc-9527-405e-bac6-734865031f83" (UID: "12231fcc-9527-405e-bac6-734865031f83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.419056 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12231fcc-9527-405e-bac6-734865031f83-kube-api-access-st6r9" (OuterVolumeSpecName: "kube-api-access-st6r9") pod "12231fcc-9527-405e-bac6-734865031f83" (UID: "12231fcc-9527-405e-bac6-734865031f83"). InnerVolumeSpecName "kube-api-access-st6r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.495388 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12231fcc-9527-405e-bac6-734865031f83" (UID: "12231fcc-9527-405e-bac6-734865031f83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.511852 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-utilities\") pod \"4a117bdc-99af-4fd8-a810-b2d08e174f77\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.511901 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-utilities\") pod \"13c7d511-de5b-4e9d-acdc-615d18346215\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.511926 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-catalog-content\") pod \"13c7d511-de5b-4e9d-acdc-615d18346215\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.511972 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-trusted-ca\") pod \"7599df20-a4f3-4f48-b413-8eec3c0bba38\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512003 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkrng\" (UniqueName: \"kubernetes.io/projected/6a6ce2ed-da59-4d16-8d01-022b22e746f1-kube-api-access-fkrng\") pod \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512027 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lmv29\" (UniqueName: \"kubernetes.io/projected/13c7d511-de5b-4e9d-acdc-615d18346215-kube-api-access-lmv29\") pod \"13c7d511-de5b-4e9d-acdc-615d18346215\" (UID: \"13c7d511-de5b-4e9d-acdc-615d18346215\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512054 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btxhr\" (UniqueName: \"kubernetes.io/projected/4a117bdc-99af-4fd8-a810-b2d08e174f77-kube-api-access-btxhr\") pod \"4a117bdc-99af-4fd8-a810-b2d08e174f77\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512076 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-catalog-content\") pod \"4a117bdc-99af-4fd8-a810-b2d08e174f77\" (UID: \"4a117bdc-99af-4fd8-a810-b2d08e174f77\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512118 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-utilities\") pod \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512140 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw22t\" (UniqueName: \"kubernetes.io/projected/7599df20-a4f3-4f48-b413-8eec3c0bba38-kube-api-access-tw22t\") pod \"7599df20-a4f3-4f48-b413-8eec3c0bba38\" (UID: \"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512171 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-operator-metrics\") pod \"7599df20-a4f3-4f48-b413-8eec3c0bba38\" (UID: 
\"7599df20-a4f3-4f48-b413-8eec3c0bba38\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512198 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-catalog-content\") pod \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\" (UID: \"6a6ce2ed-da59-4d16-8d01-022b22e746f1\") " Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512575 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-utilities" (OuterVolumeSpecName: "utilities") pod "13c7d511-de5b-4e9d-acdc-615d18346215" (UID: "13c7d511-de5b-4e9d-acdc-615d18346215"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.512551 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-utilities" (OuterVolumeSpecName: "utilities") pod "4a117bdc-99af-4fd8-a810-b2d08e174f77" (UID: "4a117bdc-99af-4fd8-a810-b2d08e174f77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.513678 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-utilities" (OuterVolumeSpecName: "utilities") pod "6a6ce2ed-da59-4d16-8d01-022b22e746f1" (UID: "6a6ce2ed-da59-4d16-8d01-022b22e746f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.513850 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7599df20-a4f3-4f48-b413-8eec3c0bba38" (UID: "7599df20-a4f3-4f48-b413-8eec3c0bba38"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.513962 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.513983 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.513993 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.514005 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st6r9\" (UniqueName: \"kubernetes.io/projected/12231fcc-9527-405e-bac6-734865031f83-kube-api-access-st6r9\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.514013 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12231fcc-9527-405e-bac6-734865031f83-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.517800 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4a117bdc-99af-4fd8-a810-b2d08e174f77-kube-api-access-btxhr" (OuterVolumeSpecName: "kube-api-access-btxhr") pod "4a117bdc-99af-4fd8-a810-b2d08e174f77" (UID: "4a117bdc-99af-4fd8-a810-b2d08e174f77"). InnerVolumeSpecName "kube-api-access-btxhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.520774 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7599df20-a4f3-4f48-b413-8eec3c0bba38-kube-api-access-tw22t" (OuterVolumeSpecName: "kube-api-access-tw22t") pod "7599df20-a4f3-4f48-b413-8eec3c0bba38" (UID: "7599df20-a4f3-4f48-b413-8eec3c0bba38"). InnerVolumeSpecName "kube-api-access-tw22t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.533763 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7599df20-a4f3-4f48-b413-8eec3c0bba38" (UID: "7599df20-a4f3-4f48-b413-8eec3c0bba38"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.534613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c7d511-de5b-4e9d-acdc-615d18346215-kube-api-access-lmv29" (OuterVolumeSpecName: "kube-api-access-lmv29") pod "13c7d511-de5b-4e9d-acdc-615d18346215" (UID: "13c7d511-de5b-4e9d-acdc-615d18346215"). InnerVolumeSpecName "kube-api-access-lmv29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.535766 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6ce2ed-da59-4d16-8d01-022b22e746f1-kube-api-access-fkrng" (OuterVolumeSpecName: "kube-api-access-fkrng") pod "6a6ce2ed-da59-4d16-8d01-022b22e746f1" (UID: "6a6ce2ed-da59-4d16-8d01-022b22e746f1"). InnerVolumeSpecName "kube-api-access-fkrng". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.555049 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a6ce2ed-da59-4d16-8d01-022b22e746f1" (UID: "6a6ce2ed-da59-4d16-8d01-022b22e746f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.566278 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slkqt"] Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.573135 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a117bdc-99af-4fd8-a810-b2d08e174f77" (UID: "4a117bdc-99af-4fd8-a810-b2d08e174f77"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.615524 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.615872 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkrng\" (UniqueName: \"kubernetes.io/projected/6a6ce2ed-da59-4d16-8d01-022b22e746f1-kube-api-access-fkrng\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.615957 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmv29\" (UniqueName: \"kubernetes.io/projected/13c7d511-de5b-4e9d-acdc-615d18346215-kube-api-access-lmv29\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.616028 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btxhr\" (UniqueName: \"kubernetes.io/projected/4a117bdc-99af-4fd8-a810-b2d08e174f77-kube-api-access-btxhr\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.616084 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a117bdc-99af-4fd8-a810-b2d08e174f77-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.616140 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.616202 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw22t\" (UniqueName: \"kubernetes.io/projected/7599df20-a4f3-4f48-b413-8eec3c0bba38-kube-api-access-tw22t\") on node \"crc\" DevicePath \"\"" Jan 
30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.616260 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7599df20-a4f3-4f48-b413-8eec3c0bba38-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.616321 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ce2ed-da59-4d16-8d01-022b22e746f1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.691211 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13c7d511-de5b-4e9d-acdc-615d18346215" (UID: "13c7d511-de5b-4e9d-acdc-615d18346215"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:40 crc kubenswrapper[4841]: I0130 05:12:40.717354 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c7d511-de5b-4e9d-acdc-615d18346215-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.040713 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" event={"ID":"05338f03-2aa7-4afc-80f6-44a70f084420","Type":"ContainerStarted","Data":"3feb21ff5120348d9bd2bca4216da55fef7d1e6e47aff1d3329497a9b348193e"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.040928 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" event={"ID":"05338f03-2aa7-4afc-80f6-44a70f084420","Type":"ContainerStarted","Data":"9ab5304e9a13cccc3f96d4f49cfcf276f2f23268f2505c2edb77c2695713bdde"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 
05:12:41.040951 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.042802 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.043898 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzz8x" event={"ID":"6a6ce2ed-da59-4d16-8d01-022b22e746f1","Type":"ContainerDied","Data":"04839d41d0b38db171354dcaa9510504c7298ab5eb1c6ba11acd1bf902471b9b"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.043913 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzz8x" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.043930 4841 scope.go:117] "RemoveContainer" containerID="4ba16a3964468fbeafe5fd4b370713974204613a888d8d447b8c93eca07054f9" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.047027 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t88dq" event={"ID":"12231fcc-9527-405e-bac6-734865031f83","Type":"ContainerDied","Data":"d9cd64144ba1fc5848c24bfa84a57f177fc74339ffc06f4302f221f591f87eab"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.047176 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t88dq" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.059819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" event={"ID":"7599df20-a4f3-4f48-b413-8eec3c0bba38","Type":"ContainerDied","Data":"f9d3180e69bd282fe3b847b6b43b625085b6ee927a0eab391a1483b0339c07da"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.059832 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4c6xs" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.062502 4841 generic.go:334] "Generic (PLEG): container finished" podID="13c7d511-de5b-4e9d-acdc-615d18346215" containerID="70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284" exitCode=0 Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.062591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pv7l" event={"ID":"13c7d511-de5b-4e9d-acdc-615d18346215","Type":"ContainerDied","Data":"70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.062629 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pv7l" event={"ID":"13c7d511-de5b-4e9d-acdc-615d18346215","Type":"ContainerDied","Data":"570516297c13d95ae4b70da6b81dcd5101ee8c2acf2068c1ebdd69b190288575"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.062747 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pv7l" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.067888 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8lz7" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.067928 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8lz7" event={"ID":"4a117bdc-99af-4fd8-a810-b2d08e174f77","Type":"ContainerDied","Data":"8120d6002c95afbd79abe10ef0c5de7d47d6558d485a4607c2819131d152793a"} Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.075256 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-slkqt" podStartSLOduration=2.075240216 podStartE2EDuration="2.075240216s" podCreationTimestamp="2026-01-30 05:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:12:41.074755305 +0000 UTC m=+298.068227993" watchObservedRunningTime="2026-01-30 05:12:41.075240216 +0000 UTC m=+298.068712854" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.092295 4841 scope.go:117] "RemoveContainer" containerID="1f783a52545342ff19e5d16cde005f3b602741f7151f446a8ce5dcdea35fa19a" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.131073 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c6xs"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.134983 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4c6xs"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.141591 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzz8x"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.145280 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzz8x"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.153746 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-t88dq"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.156721 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t88dq"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.158169 4841 scope.go:117] "RemoveContainer" containerID="39b9f7530dc9e32f761d0518b1512207f4775634bc4cb5140afc51abfa02aa8a" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.170756 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pv7l"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.174234 4841 scope.go:117] "RemoveContainer" containerID="c3e3310da4b49162dfc532fc80d8b8d94dae98b15b44de787a1da985dfef4b5f" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.180825 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9pv7l"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.189516 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8lz7"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.192734 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g8lz7"] Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.201202 4841 scope.go:117] "RemoveContainer" containerID="eabf638e1e003020430b9da88dfa4942f3166ae57a60ec4df1c92e392ec86c4b" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.216246 4841 scope.go:117] "RemoveContainer" containerID="de97f35d7278621768d6fd3f5a32b75aef6db03f5a3f2071ab9995bbfbc2d67d" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.235702 4841 scope.go:117] "RemoveContainer" containerID="b4c5ec1c563b6c96488f2960c37456f3c0275c8eecb5d48c7068cea77991ebf3" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.254877 4841 scope.go:117] "RemoveContainer" containerID="70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284" Jan 30 
05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.275786 4841 scope.go:117] "RemoveContainer" containerID="7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.291593 4841 scope.go:117] "RemoveContainer" containerID="66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.310166 4841 scope.go:117] "RemoveContainer" containerID="70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284" Jan 30 05:12:41 crc kubenswrapper[4841]: E0130 05:12:41.310549 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284\": container with ID starting with 70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284 not found: ID does not exist" containerID="70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.310593 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284"} err="failed to get container status \"70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284\": rpc error: code = NotFound desc = could not find container \"70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284\": container with ID starting with 70ab8315ea84af53212747a052ef825c62104a5d53bd297a5713ec6bffb30284 not found: ID does not exist" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.310621 4841 scope.go:117] "RemoveContainer" containerID="7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d" Jan 30 05:12:41 crc kubenswrapper[4841]: E0130 05:12:41.311003 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d\": container with ID starting with 7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d not found: ID does not exist" containerID="7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.311049 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d"} err="failed to get container status \"7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d\": rpc error: code = NotFound desc = could not find container \"7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d\": container with ID starting with 7f8f28c4fc6e50e80368f4e8757fbbe47cfd12f574d358b9fdd9d951b29ded2d not found: ID does not exist" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.311078 4841 scope.go:117] "RemoveContainer" containerID="66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8" Jan 30 05:12:41 crc kubenswrapper[4841]: E0130 05:12:41.311518 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8\": container with ID starting with 66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8 not found: ID does not exist" containerID="66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.311547 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8"} err="failed to get container status \"66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8\": rpc error: code = NotFound desc = could not find container \"66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8\": container with ID 
starting with 66fc3d09ce39daa2a0cf45cffc19a1655ade83dfb5253832731eb5f43d3fa2a8 not found: ID does not exist" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.311569 4841 scope.go:117] "RemoveContainer" containerID="53c57b5e2798421fc4a66deb59265f89b7723297b8552704b8d55553ae59063d" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.321628 4841 scope.go:117] "RemoveContainer" containerID="2c66e8c8fcd07a8f41b10764c3f1d21ade4ef113625266fc99161e0efb1ce4fe" Jan 30 05:12:41 crc kubenswrapper[4841]: I0130 05:12:41.332714 4841 scope.go:117] "RemoveContainer" containerID="a3e8e7f692a58c41d7bfddcfd37abf5b0f769a8b168cbbc22bf45c0258a054cb" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065365 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8rcwn"] Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065566 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065578 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065588 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065594 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065604 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065611 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" 
containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065620 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065626 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065635 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065640 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065651 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065657 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065666 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065671 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065679 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065685 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" 
containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065692 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065697 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065705 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065711 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065718 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065723 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065731 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065737 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerName="extract-utilities" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065742 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065748 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" 
containerName="extract-content" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065824 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065834 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065844 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065851 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065860 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="12231fcc-9527-405e-bac6-734865031f83" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065868 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" containerName="registry-server" Jan 30 05:12:42 crc kubenswrapper[4841]: E0130 05:12:42.065943 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.065950 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" containerName="marketplace-operator" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.066518 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.069278 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.090923 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rcwn"] Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.135982 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-utilities\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.136074 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njw5z\" (UniqueName: \"kubernetes.io/projected/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-kube-api-access-njw5z\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.136136 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-catalog-content\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.237777 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-utilities\") pod \"redhat-marketplace-8rcwn\" (UID: 
\"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.237880 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njw5z\" (UniqueName: \"kubernetes.io/projected/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-kube-api-access-njw5z\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.238019 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-catalog-content\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.238663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-utilities\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.238667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-catalog-content\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.282667 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lb7sq"] Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.283764 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.284897 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njw5z\" (UniqueName: \"kubernetes.io/projected/d383667a-b7ea-42ea-b91f-4ac4306ddbaf-kube-api-access-njw5z\") pod \"redhat-marketplace-8rcwn\" (UID: \"d383667a-b7ea-42ea-b91f-4ac4306ddbaf\") " pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.288235 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.292460 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lb7sq"] Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.339545 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsh5s\" (UniqueName: \"kubernetes.io/projected/3fd4a2f0-3409-4512-9591-fd515639c1ea-kube-api-access-qsh5s\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.339635 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd4a2f0-3409-4512-9591-fd515639c1ea-catalog-content\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.339714 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd4a2f0-3409-4512-9591-fd515639c1ea-utilities\") pod \"certified-operators-lb7sq\" (UID: 
\"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.390867 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.439708 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12231fcc-9527-405e-bac6-734865031f83" path="/var/lib/kubelet/pods/12231fcc-9527-405e-bac6-734865031f83/volumes" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.440860 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c7d511-de5b-4e9d-acdc-615d18346215" path="/var/lib/kubelet/pods/13c7d511-de5b-4e9d-acdc-615d18346215/volumes" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.441016 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd4a2f0-3409-4512-9591-fd515639c1ea-utilities\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.441109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsh5s\" (UniqueName: \"kubernetes.io/projected/3fd4a2f0-3409-4512-9591-fd515639c1ea-kube-api-access-qsh5s\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.441137 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd4a2f0-3409-4512-9591-fd515639c1ea-catalog-content\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 
crc kubenswrapper[4841]: I0130 05:12:42.441567 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd4a2f0-3409-4512-9591-fd515639c1ea-catalog-content\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.441831 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd4a2f0-3409-4512-9591-fd515639c1ea-utilities\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.446665 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a117bdc-99af-4fd8-a810-b2d08e174f77" path="/var/lib/kubelet/pods/4a117bdc-99af-4fd8-a810-b2d08e174f77/volumes" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.447480 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6ce2ed-da59-4d16-8d01-022b22e746f1" path="/var/lib/kubelet/pods/6a6ce2ed-da59-4d16-8d01-022b22e746f1/volumes" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.448375 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7599df20-a4f3-4f48-b413-8eec3c0bba38" path="/var/lib/kubelet/pods/7599df20-a4f3-4f48-b413-8eec3c0bba38/volumes" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.460644 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsh5s\" (UniqueName: \"kubernetes.io/projected/3fd4a2f0-3409-4512-9591-fd515639c1ea-kube-api-access-qsh5s\") pod \"certified-operators-lb7sq\" (UID: \"3fd4a2f0-3409-4512-9591-fd515639c1ea\") " pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.583121 4841 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rcwn"] Jan 30 05:12:42 crc kubenswrapper[4841]: W0130 05:12:42.591776 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd383667a_b7ea_42ea_b91f_4ac4306ddbaf.slice/crio-fcadfce41d31a8413fb24e8b9357fcef8ac71ba26029486fc2324c4fc3369b6e WatchSource:0}: Error finding container fcadfce41d31a8413fb24e8b9357fcef8ac71ba26029486fc2324c4fc3369b6e: Status 404 returned error can't find the container with id fcadfce41d31a8413fb24e8b9357fcef8ac71ba26029486fc2324c4fc3369b6e Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.637458 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:42 crc kubenswrapper[4841]: I0130 05:12:42.829892 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lb7sq"] Jan 30 05:12:42 crc kubenswrapper[4841]: W0130 05:12:42.840771 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd4a2f0_3409_4512_9591_fd515639c1ea.slice/crio-2230b601320a80b65d840d51d9d1188b1563e0368f72bd81789133f97de32b2e WatchSource:0}: Error finding container 2230b601320a80b65d840d51d9d1188b1563e0368f72bd81789133f97de32b2e: Status 404 returned error can't find the container with id 2230b601320a80b65d840d51d9d1188b1563e0368f72bd81789133f97de32b2e Jan 30 05:12:43 crc kubenswrapper[4841]: I0130 05:12:43.087191 4841 generic.go:334] "Generic (PLEG): container finished" podID="3fd4a2f0-3409-4512-9591-fd515639c1ea" containerID="875232ba39080ca86e1c53bd6686287237c8ead25d46e41625b5c20cc963a2d7" exitCode=0 Jan 30 05:12:43 crc kubenswrapper[4841]: I0130 05:12:43.087254 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lb7sq" 
event={"ID":"3fd4a2f0-3409-4512-9591-fd515639c1ea","Type":"ContainerDied","Data":"875232ba39080ca86e1c53bd6686287237c8ead25d46e41625b5c20cc963a2d7"} Jan 30 05:12:43 crc kubenswrapper[4841]: I0130 05:12:43.087284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lb7sq" event={"ID":"3fd4a2f0-3409-4512-9591-fd515639c1ea","Type":"ContainerStarted","Data":"2230b601320a80b65d840d51d9d1188b1563e0368f72bd81789133f97de32b2e"} Jan 30 05:12:43 crc kubenswrapper[4841]: I0130 05:12:43.089019 4841 generic.go:334] "Generic (PLEG): container finished" podID="d383667a-b7ea-42ea-b91f-4ac4306ddbaf" containerID="01eaf95d4f82bf5d61c0e6f5607bdc02fae4302ebe86610b77bafadd9cd3c796" exitCode=0 Jan 30 05:12:43 crc kubenswrapper[4841]: I0130 05:12:43.089036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rcwn" event={"ID":"d383667a-b7ea-42ea-b91f-4ac4306ddbaf","Type":"ContainerDied","Data":"01eaf95d4f82bf5d61c0e6f5607bdc02fae4302ebe86610b77bafadd9cd3c796"} Jan 30 05:12:43 crc kubenswrapper[4841]: I0130 05:12:43.089051 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rcwn" event={"ID":"d383667a-b7ea-42ea-b91f-4ac4306ddbaf","Type":"ContainerStarted","Data":"fcadfce41d31a8413fb24e8b9357fcef8ac71ba26029486fc2324c4fc3369b6e"} Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.083356 4841 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.094596 4841 generic.go:334] "Generic (PLEG): container finished" podID="3fd4a2f0-3409-4512-9591-fd515639c1ea" containerID="dea90c876a40a6a0401a25310c36fb8ced0274db2dcffd017efc77e8d113d5b2" exitCode=0 Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.094663 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lb7sq" 
event={"ID":"3fd4a2f0-3409-4512-9591-fd515639c1ea","Type":"ContainerDied","Data":"dea90c876a40a6a0401a25310c36fb8ced0274db2dcffd017efc77e8d113d5b2"} Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.098191 4841 generic.go:334] "Generic (PLEG): container finished" podID="d383667a-b7ea-42ea-b91f-4ac4306ddbaf" containerID="0419c1a3996d966f8cbe0a4b15ea4b9e36699157c526b242155c116ea1c03410" exitCode=0 Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.098217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rcwn" event={"ID":"d383667a-b7ea-42ea-b91f-4ac4306ddbaf","Type":"ContainerDied","Data":"0419c1a3996d966f8cbe0a4b15ea4b9e36699157c526b242155c116ea1c03410"} Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.468051 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s766p"] Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.469368 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.474383 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.487945 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s766p"] Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.570980 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34e1d3e-f2d8-466b-b80e-158f344ac558-catalog-content\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.571156 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34e1d3e-f2d8-466b-b80e-158f344ac558-utilities\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.571180 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cg94\" (UniqueName: \"kubernetes.io/projected/e34e1d3e-f2d8-466b-b80e-158f344ac558-kube-api-access-6cg94\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.672362 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34e1d3e-f2d8-466b-b80e-158f344ac558-utilities\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " 
pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.672466 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cg94\" (UniqueName: \"kubernetes.io/projected/e34e1d3e-f2d8-466b-b80e-158f344ac558-kube-api-access-6cg94\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.672516 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34e1d3e-f2d8-466b-b80e-158f344ac558-catalog-content\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.672906 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34e1d3e-f2d8-466b-b80e-158f344ac558-utilities\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.673190 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34e1d3e-f2d8-466b-b80e-158f344ac558-catalog-content\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.676079 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q52v7"] Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.678246 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.683622 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.684746 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q52v7"] Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.714355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cg94\" (UniqueName: \"kubernetes.io/projected/e34e1d3e-f2d8-466b-b80e-158f344ac558-kube-api-access-6cg94\") pod \"redhat-operators-s766p\" (UID: \"e34e1d3e-f2d8-466b-b80e-158f344ac558\") " pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.773346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-catalog-content\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.773727 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-utilities\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.773749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqs2d\" (UniqueName: \"kubernetes.io/projected/0bb7f95e-112b-4fe7-89eb-398eea0d0472-kube-api-access-wqs2d\") pod \"community-operators-q52v7\" (UID: 
\"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.835255 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.875111 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-catalog-content\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.875190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-utilities\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.875225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqs2d\" (UniqueName: \"kubernetes.io/projected/0bb7f95e-112b-4fe7-89eb-398eea0d0472-kube-api-access-wqs2d\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.876048 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-catalog-content\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.876183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-utilities\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:44 crc kubenswrapper[4841]: I0130 05:12:44.892097 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqs2d\" (UniqueName: \"kubernetes.io/projected/0bb7f95e-112b-4fe7-89eb-398eea0d0472-kube-api-access-wqs2d\") pod \"community-operators-q52v7\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:45 crc kubenswrapper[4841]: I0130 05:12:45.029724 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:45 crc kubenswrapper[4841]: I0130 05:12:45.127235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lb7sq" event={"ID":"3fd4a2f0-3409-4512-9591-fd515639c1ea","Type":"ContainerStarted","Data":"8c3ed8930c3cb01980869a1f73cfde0a5dc05f6516e35821491690f1b699abcf"} Jan 30 05:12:45 crc kubenswrapper[4841]: I0130 05:12:45.129862 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rcwn" event={"ID":"d383667a-b7ea-42ea-b91f-4ac4306ddbaf","Type":"ContainerStarted","Data":"f107a09ccd069442c8f3950a8a3b7fbb2da510bf5c2fbc656f3dde3949b8513e"} Jan 30 05:12:45 crc kubenswrapper[4841]: I0130 05:12:45.148086 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lb7sq" podStartSLOduration=1.680352868 podStartE2EDuration="3.148069173s" podCreationTimestamp="2026-01-30 05:12:42 +0000 UTC" firstStartedPulling="2026-01-30 05:12:43.088410116 +0000 UTC m=+300.081882754" lastFinishedPulling="2026-01-30 05:12:44.556126391 +0000 UTC m=+301.549599059" observedRunningTime="2026-01-30 
05:12:45.145052163 +0000 UTC m=+302.138524801" watchObservedRunningTime="2026-01-30 05:12:45.148069173 +0000 UTC m=+302.141541811" Jan 30 05:12:45 crc kubenswrapper[4841]: I0130 05:12:45.169014 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8rcwn" podStartSLOduration=1.552974217 podStartE2EDuration="3.168997203s" podCreationTimestamp="2026-01-30 05:12:42 +0000 UTC" firstStartedPulling="2026-01-30 05:12:43.089932511 +0000 UTC m=+300.083405149" lastFinishedPulling="2026-01-30 05:12:44.705955497 +0000 UTC m=+301.699428135" observedRunningTime="2026-01-30 05:12:45.164696962 +0000 UTC m=+302.158169620" watchObservedRunningTime="2026-01-30 05:12:45.168997203 +0000 UTC m=+302.162469831" Jan 30 05:12:45 crc kubenswrapper[4841]: I0130 05:12:45.249574 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s766p"] Jan 30 05:12:45 crc kubenswrapper[4841]: I0130 05:12:45.254205 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q52v7"] Jan 30 05:12:46 crc kubenswrapper[4841]: I0130 05:12:46.136290 4841 generic.go:334] "Generic (PLEG): container finished" podID="e34e1d3e-f2d8-466b-b80e-158f344ac558" containerID="d5bc853131657c7148ddbf511ed1670faefcc7fd272c27a449e76aac835a2f5e" exitCode=0 Jan 30 05:12:46 crc kubenswrapper[4841]: I0130 05:12:46.136597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s766p" event={"ID":"e34e1d3e-f2d8-466b-b80e-158f344ac558","Type":"ContainerDied","Data":"d5bc853131657c7148ddbf511ed1670faefcc7fd272c27a449e76aac835a2f5e"} Jan 30 05:12:46 crc kubenswrapper[4841]: I0130 05:12:46.136623 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s766p" event={"ID":"e34e1d3e-f2d8-466b-b80e-158f344ac558","Type":"ContainerStarted","Data":"0687a99ef8c70b4225025bc3429615d166db802f79586966adda6404e5ab2f27"} 
Jan 30 05:12:46 crc kubenswrapper[4841]: I0130 05:12:46.141932 4841 generic.go:334] "Generic (PLEG): container finished" podID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerID="db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e" exitCode=0 Jan 30 05:12:46 crc kubenswrapper[4841]: I0130 05:12:46.142950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q52v7" event={"ID":"0bb7f95e-112b-4fe7-89eb-398eea0d0472","Type":"ContainerDied","Data":"db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e"} Jan 30 05:12:46 crc kubenswrapper[4841]: I0130 05:12:46.142986 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q52v7" event={"ID":"0bb7f95e-112b-4fe7-89eb-398eea0d0472","Type":"ContainerStarted","Data":"42e3f78653573f945163d6a7de5c5140e1f008cee1e99223579aaca8b9d77c46"} Jan 30 05:12:47 crc kubenswrapper[4841]: I0130 05:12:47.148293 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q52v7" event={"ID":"0bb7f95e-112b-4fe7-89eb-398eea0d0472","Type":"ContainerStarted","Data":"9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3"} Jan 30 05:12:47 crc kubenswrapper[4841]: I0130 05:12:47.151812 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s766p" event={"ID":"e34e1d3e-f2d8-466b-b80e-158f344ac558","Type":"ContainerStarted","Data":"8d30c52aa629b9bfa347b757498d232ace5d38dcfbba442e2098e8db8666e0be"} Jan 30 05:12:48 crc kubenswrapper[4841]: I0130 05:12:48.163439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s766p" event={"ID":"e34e1d3e-f2d8-466b-b80e-158f344ac558","Type":"ContainerDied","Data":"8d30c52aa629b9bfa347b757498d232ace5d38dcfbba442e2098e8db8666e0be"} Jan 30 05:12:48 crc kubenswrapper[4841]: I0130 05:12:48.163393 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="e34e1d3e-f2d8-466b-b80e-158f344ac558" containerID="8d30c52aa629b9bfa347b757498d232ace5d38dcfbba442e2098e8db8666e0be" exitCode=0 Jan 30 05:12:48 crc kubenswrapper[4841]: I0130 05:12:48.165482 4841 generic.go:334] "Generic (PLEG): container finished" podID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerID="9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3" exitCode=0 Jan 30 05:12:48 crc kubenswrapper[4841]: I0130 05:12:48.165516 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q52v7" event={"ID":"0bb7f95e-112b-4fe7-89eb-398eea0d0472","Type":"ContainerDied","Data":"9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3"} Jan 30 05:12:49 crc kubenswrapper[4841]: I0130 05:12:49.173049 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s766p" event={"ID":"e34e1d3e-f2d8-466b-b80e-158f344ac558","Type":"ContainerStarted","Data":"92d800ef233652bf4cdaa1ddcf3818e61c50d48fc3e3f2995d9b5a647fe92fc9"} Jan 30 05:12:49 crc kubenswrapper[4841]: I0130 05:12:49.175144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q52v7" event={"ID":"0bb7f95e-112b-4fe7-89eb-398eea0d0472","Type":"ContainerStarted","Data":"2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67"} Jan 30 05:12:49 crc kubenswrapper[4841]: I0130 05:12:49.195710 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s766p" podStartSLOduration=2.758882548 podStartE2EDuration="5.195691131s" podCreationTimestamp="2026-01-30 05:12:44 +0000 UTC" firstStartedPulling="2026-01-30 05:12:46.138963201 +0000 UTC m=+303.132435849" lastFinishedPulling="2026-01-30 05:12:48.575771784 +0000 UTC m=+305.569244432" observedRunningTime="2026-01-30 05:12:49.194451241 +0000 UTC m=+306.187923879" watchObservedRunningTime="2026-01-30 05:12:49.195691131 +0000 UTC m=+306.189163789" Jan 30 
05:12:49 crc kubenswrapper[4841]: I0130 05:12:49.218690 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q52v7" podStartSLOduration=2.599414586 podStartE2EDuration="5.218673108s" podCreationTimestamp="2026-01-30 05:12:44 +0000 UTC" firstStartedPulling="2026-01-30 05:12:46.144495671 +0000 UTC m=+303.137968319" lastFinishedPulling="2026-01-30 05:12:48.763754203 +0000 UTC m=+305.757226841" observedRunningTime="2026-01-30 05:12:49.216964128 +0000 UTC m=+306.210436766" watchObservedRunningTime="2026-01-30 05:12:49.218673108 +0000 UTC m=+306.212145746" Jan 30 05:12:52 crc kubenswrapper[4841]: I0130 05:12:52.391843 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:52 crc kubenswrapper[4841]: I0130 05:12:52.394761 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:52 crc kubenswrapper[4841]: I0130 05:12:52.454591 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:52 crc kubenswrapper[4841]: I0130 05:12:52.650547 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:52 crc kubenswrapper[4841]: I0130 05:12:52.650583 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:52 crc kubenswrapper[4841]: I0130 05:12:52.705936 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:53 crc kubenswrapper[4841]: I0130 05:12:53.251102 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lb7sq" Jan 30 05:12:53 crc kubenswrapper[4841]: I0130 
05:12:53.252370 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8rcwn" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.276372 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" podUID="79ec5e12-1868-4efd-a76c-e7a06360cb3b" containerName="registry" containerID="cri-o://f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28" gracePeriod=30 Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.738329 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820311 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ec5e12-1868-4efd-a76c-e7a06360cb3b-ca-trust-extracted\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820386 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tss7\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-kube-api-access-7tss7\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820435 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-certificates\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820455 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/79ec5e12-1868-4efd-a76c-e7a06360cb3b-installation-pull-secrets\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820482 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-tls\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820636 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820655 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-bound-sa-token\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.820694 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-trusted-ca\") pod \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\" (UID: \"79ec5e12-1868-4efd-a76c-e7a06360cb3b\") " Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.821818 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.821849 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.826454 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.826684 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.828492 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-kube-api-access-7tss7" (OuterVolumeSpecName: "kube-api-access-7tss7") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "kube-api-access-7tss7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.835688 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.835722 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.835747 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ec5e12-1868-4efd-a76c-e7a06360cb3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.841969 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.861077 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79ec5e12-1868-4efd-a76c-e7a06360cb3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "79ec5e12-1868-4efd-a76c-e7a06360cb3b" (UID: "79ec5e12-1868-4efd-a76c-e7a06360cb3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.922255 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tss7\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-kube-api-access-7tss7\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.922292 4841 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.922308 4841 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79ec5e12-1868-4efd-a76c-e7a06360cb3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.922320 4841 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.922335 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79ec5e12-1868-4efd-a76c-e7a06360cb3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.922346 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79ec5e12-1868-4efd-a76c-e7a06360cb3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:54 crc kubenswrapper[4841]: I0130 05:12:54.922357 4841 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ec5e12-1868-4efd-a76c-e7a06360cb3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 05:12:55 crc 
kubenswrapper[4841]: I0130 05:12:55.030376 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.030514 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.088890 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.209244 4841 generic.go:334] "Generic (PLEG): container finished" podID="79ec5e12-1868-4efd-a76c-e7a06360cb3b" containerID="f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28" exitCode=0 Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.209425 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" event={"ID":"79ec5e12-1868-4efd-a76c-e7a06360cb3b","Type":"ContainerDied","Data":"f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28"} Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.210270 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" event={"ID":"79ec5e12-1868-4efd-a76c-e7a06360cb3b","Type":"ContainerDied","Data":"a083385be97f65aa7fe56523cba5017456bef1b3b22c07eab2fffa9bd9ada7f9"} Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.210291 4841 scope.go:117] "RemoveContainer" containerID="f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.209509 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2f7jj" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.230734 4841 scope.go:117] "RemoveContainer" containerID="f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28" Jan 30 05:12:55 crc kubenswrapper[4841]: E0130 05:12:55.231178 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28\": container with ID starting with f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28 not found: ID does not exist" containerID="f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.231221 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28"} err="failed to get container status \"f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28\": rpc error: code = NotFound desc = could not find container \"f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28\": container with ID starting with f025c92b3e3bd9bd07722993cacc183ed0e5e91f5e06876f19968533ce047b28 not found: ID does not exist" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.237564 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2f7jj"] Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.241469 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2f7jj"] Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.268155 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q52v7" Jan 30 05:12:55 crc kubenswrapper[4841]: I0130 05:12:55.880569 4841 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-s766p" podUID="e34e1d3e-f2d8-466b-b80e-158f344ac558" containerName="registry-server" probeResult="failure" output=< Jan 30 05:12:55 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Jan 30 05:12:55 crc kubenswrapper[4841]: > Jan 30 05:12:56 crc kubenswrapper[4841]: I0130 05:12:56.440869 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ec5e12-1868-4efd-a76c-e7a06360cb3b" path="/var/lib/kubelet/pods/79ec5e12-1868-4efd-a76c-e7a06360cb3b/volumes" Jan 30 05:13:04 crc kubenswrapper[4841]: I0130 05:13:04.903728 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:13:04 crc kubenswrapper[4841]: I0130 05:13:04.977315 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s766p" Jan 30 05:13:40 crc kubenswrapper[4841]: I0130 05:13:40.463621 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:13:40 crc kubenswrapper[4841]: I0130 05:13:40.464251 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:14:10 crc kubenswrapper[4841]: I0130 05:14:10.464021 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Jan 30 05:14:10 crc kubenswrapper[4841]: I0130 05:14:10.464888 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.463493 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.464101 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.464157 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.464997 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abf7e2b0da98eaaaac474db21786882f5f51c317ca2c9bd69e78825d71977aaa"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.465080 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://abf7e2b0da98eaaaac474db21786882f5f51c317ca2c9bd69e78825d71977aaa" gracePeriod=600 Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.925288 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="abf7e2b0da98eaaaac474db21786882f5f51c317ca2c9bd69e78825d71977aaa" exitCode=0 Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.925455 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"abf7e2b0da98eaaaac474db21786882f5f51c317ca2c9bd69e78825d71977aaa"} Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.926223 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"8c604b3b136e322bd1c3a2747b527f8fe3d354491e2d2280a44ada1ad6f6b10d"} Jan 30 05:14:40 crc kubenswrapper[4841]: I0130 05:14:40.926301 4841 scope.go:117] "RemoveContainer" containerID="9d62de2a8f6dec1ef3fd77500025a4ce24d2bd1a9a737dc90e90c85810ebf801" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.186158 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t"] Jan 30 05:15:00 crc kubenswrapper[4841]: E0130 05:15:00.186970 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec5e12-1868-4efd-a76c-e7a06360cb3b" containerName="registry" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.186990 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec5e12-1868-4efd-a76c-e7a06360cb3b" containerName="registry" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.187162 
4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec5e12-1868-4efd-a76c-e7a06360cb3b" containerName="registry" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.187790 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.191214 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.192019 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.212500 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t"] Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.294208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91b1aae-b372-4348-b07b-0afb79ecfc61-config-volume\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.294296 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmbmw\" (UniqueName: \"kubernetes.io/projected/f91b1aae-b372-4348-b07b-0afb79ecfc61-kube-api-access-xmbmw\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.294357 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/f91b1aae-b372-4348-b07b-0afb79ecfc61-secret-volume\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.395466 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91b1aae-b372-4348-b07b-0afb79ecfc61-config-volume\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.395531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmbmw\" (UniqueName: \"kubernetes.io/projected/f91b1aae-b372-4348-b07b-0afb79ecfc61-kube-api-access-xmbmw\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.395597 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f91b1aae-b372-4348-b07b-0afb79ecfc61-secret-volume\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.397135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91b1aae-b372-4348-b07b-0afb79ecfc61-config-volume\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: 
I0130 05:15:00.405440 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f91b1aae-b372-4348-b07b-0afb79ecfc61-secret-volume\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.446203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmbmw\" (UniqueName: \"kubernetes.io/projected/f91b1aae-b372-4348-b07b-0afb79ecfc61-kube-api-access-xmbmw\") pod \"collect-profiles-29495835-dqj6t\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.521308 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:00 crc kubenswrapper[4841]: I0130 05:15:00.772787 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t"] Jan 30 05:15:01 crc kubenswrapper[4841]: I0130 05:15:01.061801 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" event={"ID":"f91b1aae-b372-4348-b07b-0afb79ecfc61","Type":"ContainerStarted","Data":"a9dfc72753b604180f840fb28e8d834420846f09f8cc3ecf5371ce85c5fdf6d8"} Jan 30 05:15:01 crc kubenswrapper[4841]: I0130 05:15:01.062312 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" event={"ID":"f91b1aae-b372-4348-b07b-0afb79ecfc61","Type":"ContainerStarted","Data":"3ea5eadcb041f4fbf2786a2324fb29fd72dadafd3cb2386b0b82fcc3dc5e4e3c"} Jan 30 05:15:01 crc kubenswrapper[4841]: I0130 05:15:01.082486 4841 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" podStartSLOduration=1.082426038 podStartE2EDuration="1.082426038s" podCreationTimestamp="2026-01-30 05:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:15:01.08004156 +0000 UTC m=+438.073514208" watchObservedRunningTime="2026-01-30 05:15:01.082426038 +0000 UTC m=+438.075898716" Jan 30 05:15:02 crc kubenswrapper[4841]: I0130 05:15:02.080907 4841 generic.go:334] "Generic (PLEG): container finished" podID="f91b1aae-b372-4348-b07b-0afb79ecfc61" containerID="a9dfc72753b604180f840fb28e8d834420846f09f8cc3ecf5371ce85c5fdf6d8" exitCode=0 Jan 30 05:15:02 crc kubenswrapper[4841]: I0130 05:15:02.080968 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" event={"ID":"f91b1aae-b372-4348-b07b-0afb79ecfc61","Type":"ContainerDied","Data":"a9dfc72753b604180f840fb28e8d834420846f09f8cc3ecf5371ce85c5fdf6d8"} Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.422607 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.536786 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91b1aae-b372-4348-b07b-0afb79ecfc61-config-volume\") pod \"f91b1aae-b372-4348-b07b-0afb79ecfc61\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.536943 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmbmw\" (UniqueName: \"kubernetes.io/projected/f91b1aae-b372-4348-b07b-0afb79ecfc61-kube-api-access-xmbmw\") pod \"f91b1aae-b372-4348-b07b-0afb79ecfc61\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.537038 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f91b1aae-b372-4348-b07b-0afb79ecfc61-secret-volume\") pod \"f91b1aae-b372-4348-b07b-0afb79ecfc61\" (UID: \"f91b1aae-b372-4348-b07b-0afb79ecfc61\") " Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.537612 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91b1aae-b372-4348-b07b-0afb79ecfc61-config-volume" (OuterVolumeSpecName: "config-volume") pod "f91b1aae-b372-4348-b07b-0afb79ecfc61" (UID: "f91b1aae-b372-4348-b07b-0afb79ecfc61"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.544877 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91b1aae-b372-4348-b07b-0afb79ecfc61-kube-api-access-xmbmw" (OuterVolumeSpecName: "kube-api-access-xmbmw") pod "f91b1aae-b372-4348-b07b-0afb79ecfc61" (UID: "f91b1aae-b372-4348-b07b-0afb79ecfc61"). 
InnerVolumeSpecName "kube-api-access-xmbmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.545334 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91b1aae-b372-4348-b07b-0afb79ecfc61-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f91b1aae-b372-4348-b07b-0afb79ecfc61" (UID: "f91b1aae-b372-4348-b07b-0afb79ecfc61"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.638977 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmbmw\" (UniqueName: \"kubernetes.io/projected/f91b1aae-b372-4348-b07b-0afb79ecfc61-kube-api-access-xmbmw\") on node \"crc\" DevicePath \"\"" Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.639132 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f91b1aae-b372-4348-b07b-0afb79ecfc61-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:15:03 crc kubenswrapper[4841]: I0130 05:15:03.639153 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91b1aae-b372-4348-b07b-0afb79ecfc61-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:15:04 crc kubenswrapper[4841]: I0130 05:15:04.104810 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" event={"ID":"f91b1aae-b372-4348-b07b-0afb79ecfc61","Type":"ContainerDied","Data":"3ea5eadcb041f4fbf2786a2324fb29fd72dadafd3cb2386b0b82fcc3dc5e4e3c"} Jan 30 05:15:04 crc kubenswrapper[4841]: I0130 05:15:04.105654 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ea5eadcb041f4fbf2786a2324fb29fd72dadafd3cb2386b0b82fcc3dc5e4e3c" Jan 30 05:15:04 crc kubenswrapper[4841]: I0130 05:15:04.104868 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t" Jan 30 05:16:40 crc kubenswrapper[4841]: I0130 05:16:40.463827 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:16:40 crc kubenswrapper[4841]: I0130 05:16:40.464484 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:17:10 crc kubenswrapper[4841]: I0130 05:17:10.463210 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:17:10 crc kubenswrapper[4841]: I0130 05:17:10.464070 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:17:40 crc kubenswrapper[4841]: I0130 05:17:40.463599 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:17:40 crc 
kubenswrapper[4841]: I0130 05:17:40.464278 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:17:40 crc kubenswrapper[4841]: I0130 05:17:40.464335 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:17:40 crc kubenswrapper[4841]: I0130 05:17:40.465119 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c604b3b136e322bd1c3a2747b527f8fe3d354491e2d2280a44ada1ad6f6b10d"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:17:40 crc kubenswrapper[4841]: I0130 05:17:40.465214 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://8c604b3b136e322bd1c3a2747b527f8fe3d354491e2d2280a44ada1ad6f6b10d" gracePeriod=600 Jan 30 05:17:41 crc kubenswrapper[4841]: I0130 05:17:41.192138 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="8c604b3b136e322bd1c3a2747b527f8fe3d354491e2d2280a44ada1ad6f6b10d" exitCode=0 Jan 30 05:17:41 crc kubenswrapper[4841]: I0130 05:17:41.192667 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"8c604b3b136e322bd1c3a2747b527f8fe3d354491e2d2280a44ada1ad6f6b10d"} 
Jan 30 05:17:41 crc kubenswrapper[4841]: I0130 05:17:41.192719 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"59fbdcdf822ce8767e625a9fdb4978cefa9eaf5250b991d0c4b4b761a1a7e71e"} Jan 30 05:17:41 crc kubenswrapper[4841]: I0130 05:17:41.192748 4841 scope.go:117] "RemoveContainer" containerID="abf7e2b0da98eaaaac474db21786882f5f51c317ca2c9bd69e78825d71977aaa" Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.747860 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4fl5g"] Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.749182 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovn-controller" containerID="cri-o://4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1" gracePeriod=30 Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.749325 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="northd" containerID="cri-o://65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5" gracePeriod=30 Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.749390 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71" gracePeriod=30 Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.749366 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" 
podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="sbdb" containerID="cri-o://dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86" gracePeriod=30 Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.749410 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovn-acl-logging" containerID="cri-o://099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676" gracePeriod=30 Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.749352 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-node" containerID="cri-o://9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15" gracePeriod=30 Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.749688 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="nbdb" containerID="cri-o://9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026" gracePeriod=30 Jan 30 05:18:30 crc kubenswrapper[4841]: I0130 05:18:30.800477 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovnkube-controller" containerID="cri-o://b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47" gracePeriod=30 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.139013 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fl5g_8f5d8664-d53a-4e96-9458-fd915cec77b5/ovn-acl-logging/0.log" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.139714 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fl5g_8f5d8664-d53a-4e96-9458-fd915cec77b5/ovn-controller/0.log" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.140213 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.209869 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mq6pb"] Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210136 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kubecfg-setup" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210156 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kubecfg-setup" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210172 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-node" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210185 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-node" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210202 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="northd" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210215 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="northd" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210241 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210253 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210268 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="nbdb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210279 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="nbdb" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210296 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="sbdb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210309 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="sbdb" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210323 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovn-acl-logging" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210335 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovn-acl-logging" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210358 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91b1aae-b372-4348-b07b-0afb79ecfc61" containerName="collect-profiles" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210370 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91b1aae-b372-4348-b07b-0afb79ecfc61" containerName="collect-profiles" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210388 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovn-controller" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210428 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" 
containerName="ovn-controller" Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.210444 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovnkube-controller" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210457 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovnkube-controller" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210623 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="sbdb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210644 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-node" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210660 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="nbdb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210676 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovnkube-controller" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210696 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91b1aae-b372-4348-b07b-0afb79ecfc61" containerName="collect-profiles" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210709 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovn-controller" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210744 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210765 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="ovn-acl-logging" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.210781 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerName="northd" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.213786 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.304561 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-kubelet\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.304664 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305138 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-ovn\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305186 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-openvswitch\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305231 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-env-overrides\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-config\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305297 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-etc-openvswitch\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305326 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-netns\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305329 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305364 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-var-lib-openvswitch\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305381 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305425 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/8f5d8664-d53a-4e96-9458-fd915cec77b5-kube-api-access-qldm9\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305453 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-systemd\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305455 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305482 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305516 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-script-lib\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305542 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-node-log\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305569 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-slash\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305597 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-systemd-units\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305645 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-ovn-kubernetes\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305676 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-log-socket\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305710 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-bin\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305737 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-netd\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovn-node-metrics-cert\") pod \"8f5d8664-d53a-4e96-9458-fd915cec77b5\" (UID: \"8f5d8664-d53a-4e96-9458-fd915cec77b5\") " Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305878 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-systemd\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305926 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-log-socket\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305957 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-cni-netd\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.305988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306085 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-env-overrides\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306130 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306159 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-kubelet\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306187 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-node-log\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306221 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-var-lib-openvswitch\") pod 
\"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306261 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-run-netns\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-run-ovn-kubernetes\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306332 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-ovnkube-config\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306371 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-slash\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-systemd-units\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306470 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-etc-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-ovnkube-script-lib\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306524 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306538 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2574241-5ed8-4957-baab-16031cf6e340-ovn-node-metrics-cert\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306571 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-ovn\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306592 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306659 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306868 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307425 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307501 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307524 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-log-socket" (OuterVolumeSpecName: "log-socket") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.306599 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p69t5\" (UniqueName: \"kubernetes.io/projected/a2574241-5ed8-4957-baab-16031cf6e340-kube-api-access-p69t5\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307761 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-slash" (OuterVolumeSpecName: "host-slash") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307882 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307889 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-cni-bin\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307798 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.307989 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-node-log" (OuterVolumeSpecName: "node-log") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308240 4841 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308266 4841 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-slash\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308286 4841 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308310 4841 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308329 4841 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-log-socket\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308347 4841 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308368 4841 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308388 
4841 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308432 4841 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308450 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308468 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308487 4841 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308505 4841 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.308532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.312856 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.314271 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5d8664-d53a-4e96-9458-fd915cec77b5-kube-api-access-qldm9" (OuterVolumeSpecName: "kube-api-access-qldm9") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "kube-api-access-qldm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.326708 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8f5d8664-d53a-4e96-9458-fd915cec77b5" (UID: "8f5d8664-d53a-4e96-9458-fd915cec77b5"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.409786 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.409848 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-kubelet\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.409882 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-node-log\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.409917 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-var-lib-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.409941 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mq6pb\" (UID: 
\"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.409979 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-kubelet\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-node-log\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410011 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-run-netns\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.409960 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-run-netns\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410089 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-run-ovn-kubernetes\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 
05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410132 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-ovnkube-config\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410159 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-run-ovn-kubernetes\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410175 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-slash\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-var-lib-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410214 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-systemd-units\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410243 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-etc-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-ovnkube-script-lib\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2574241-5ed8-4957-baab-16031cf6e340-ovn-node-metrics-cert\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410340 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-ovn\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410368 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p69t5\" (UniqueName: \"kubernetes.io/projected/a2574241-5ed8-4957-baab-16031cf6e340-kube-api-access-p69t5\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410437 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-cni-bin\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-systemd\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410536 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-log-socket\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410569 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-cni-netd\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410635 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-env-overrides\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410706 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/8f5d8664-d53a-4e96-9458-fd915cec77b5-kube-api-access-qldm9\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410730 4841 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410751 4841 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410770 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410788 4841 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-node-log\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410806 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f5d8664-d53a-4e96-9458-fd915cec77b5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.410824 4841 reconciler_common.go:293] "Volume detached for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f5d8664-d53a-4e96-9458-fd915cec77b5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.411318 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-ovnkube-config\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.411395 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-ovn\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.411510 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-slash\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.411559 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-systemd-units\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.411608 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-etc-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.411637 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-env-overrides\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.412060 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-cni-bin\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.412113 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-systemd\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.412155 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-log-socket\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.412193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-host-cni-netd\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.412232 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2574241-5ed8-4957-baab-16031cf6e340-run-openvswitch\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.412506 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2574241-5ed8-4957-baab-16031cf6e340-ovnkube-script-lib\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.418294 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2574241-5ed8-4957-baab-16031cf6e340-ovn-node-metrics-cert\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.440211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p69t5\" (UniqueName: \"kubernetes.io/projected/a2574241-5ed8-4957-baab-16031cf6e340-kube-api-access-p69t5\") pod \"ovnkube-node-mq6pb\" (UID: \"a2574241-5ed8-4957-baab-16031cf6e340\") " pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.531699 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.532259 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c49cw_262e0db9-4560-4557-823d-8a4145e03fd1/kube-multus/0.log" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.532317 4841 generic.go:334] "Generic (PLEG): container finished" podID="262e0db9-4560-4557-823d-8a4145e03fd1" containerID="0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7" exitCode=2 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.532383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c49cw" event={"ID":"262e0db9-4560-4557-823d-8a4145e03fd1","Type":"ContainerDied","Data":"0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.533221 4841 scope.go:117] "RemoveContainer" containerID="0a1d11eb03ded7773dd8e3beb118ebf6b2b97975cd4e82206ab01c0b0b9e88f7" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.539384 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fl5g_8f5d8664-d53a-4e96-9458-fd915cec77b5/ovn-acl-logging/0.log" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.540878 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fl5g_8f5d8664-d53a-4e96-9458-fd915cec77b5/ovn-controller/0.log" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.541940 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47" exitCode=0 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.541961 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86" exitCode=0 
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.541971 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026" exitCode=0 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.541980 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5" exitCode=0 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.541989 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71" exitCode=0 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.541998 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15" exitCode=0 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542035 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676" exitCode=143 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542044 4841 generic.go:334] "Generic (PLEG): container finished" podID="8f5d8664-d53a-4e96-9458-fd915cec77b5" containerID="4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1" exitCode=143 Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542110 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542125 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542152 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542167 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542180 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542193 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} Jan 30 05:18:31 crc 
kubenswrapper[4841]: I0130 05:18:31.542200 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542210 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542220 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542228 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542235 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542241 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542249 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542255 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542261 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542268 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542275 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542295 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542302 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542309 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542317 4841 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542324 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542331 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542338 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542346 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542374 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g" event={"ID":"8f5d8664-d53a-4e96-9458-fd915cec77b5","Type":"ContainerDied","Data":"835b0026c306346f11b0b21630d16cee154fc8d3e6e7b6f10631f09b7c217c02"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542417 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542426 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542433 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542440 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542449 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542456 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542464 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542471 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542477 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"}
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542493 4841 scope.go:117] "RemoveContainer" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.542725 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fl5g"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.578629 4841 scope.go:117] "RemoveContainer" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"
Jan 30 05:18:31 crc kubenswrapper[4841]: W0130 05:18:31.601791 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2574241_5ed8_4957_baab_16031cf6e340.slice/crio-8a256fc6c0193d11897b7f9816532412f534fd8fb933761cdb023e3a1a9f2af2 WatchSource:0}: Error finding container 8a256fc6c0193d11897b7f9816532412f534fd8fb933761cdb023e3a1a9f2af2: Status 404 returned error can't find the container with id 8a256fc6c0193d11897b7f9816532412f534fd8fb933761cdb023e3a1a9f2af2
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.603164 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4fl5g"]
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.615635 4841 scope.go:117] "RemoveContainer" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.618487 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4fl5g"]
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.666134 4841 scope.go:117] "RemoveContainer" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.699706 4841 scope.go:117] "RemoveContainer" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.739083 4841 scope.go:117] "RemoveContainer" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.760514 4841 scope.go:117] "RemoveContainer" containerID="099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.787131 4841 scope.go:117] "RemoveContainer" containerID="4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.845157 4841 scope.go:117] "RemoveContainer" containerID="11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.867796 4841 scope.go:117] "RemoveContainer" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.868680 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": container with ID starting with b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47 not found: ID does not exist" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.868755 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} err="failed to get container status \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": rpc error: code = NotFound desc = could not find container \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": container with ID starting with b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.868806 4841 scope.go:117] "RemoveContainer" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.869505 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": container with ID starting with dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86 not found: ID does not exist" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.869573 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} err="failed to get container status \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": rpc error: code = NotFound desc = could not find container \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": container with ID starting with dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.869622 4841 scope.go:117] "RemoveContainer" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.870751 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": container with ID starting with 9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026 not found: ID does not exist" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.870813 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} err="failed to get container status \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": rpc error: code = NotFound desc = could not find container \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": container with ID starting with 9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.870845 4841 scope.go:117] "RemoveContainer" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.871627 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": container with ID starting with 65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5 not found: ID does not exist" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.871670 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} err="failed to get container status \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": rpc error: code = NotFound desc = could not find container \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": container with ID starting with 65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.871699 4841 scope.go:117] "RemoveContainer" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.872210 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": container with ID starting with 752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71 not found: ID does not exist" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.872263 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} err="failed to get container status \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": rpc error: code = NotFound desc = could not find container \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": container with ID starting with 752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.872301 4841 scope.go:117] "RemoveContainer" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.872779 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": container with ID starting with 9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15 not found: ID does not exist" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.872815 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} err="failed to get container status \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": rpc error: code = NotFound desc = could not find container \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": container with ID starting with 9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.872840 4841 scope.go:117] "RemoveContainer" containerID="099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.873284 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": container with ID starting with 099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676 not found: ID does not exist" containerID="099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.873333 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} err="failed to get container status \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": rpc error: code = NotFound desc = could not find container \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": container with ID starting with 099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.873363 4841 scope.go:117] "RemoveContainer" containerID="4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.873774 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": container with ID starting with 4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1 not found: ID does not exist" containerID="4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.873831 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} err="failed to get container status \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": rpc error: code = NotFound desc = could not find container \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": container with ID starting with 4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.873860 4841 scope.go:117] "RemoveContainer" containerID="11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"
Jan 30 05:18:31 crc kubenswrapper[4841]: E0130 05:18:31.874284 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": container with ID starting with 11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a not found: ID does not exist" containerID="11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.874345 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"} err="failed to get container status \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": rpc error: code = NotFound desc = could not find container \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": container with ID starting with 11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.874379 4841 scope.go:117] "RemoveContainer" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.874887 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} err="failed to get container status \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": rpc error: code = NotFound desc = could not find container \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": container with ID starting with b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.874922 4841 scope.go:117] "RemoveContainer" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.875523 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} err="failed to get container status \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": rpc error: code = NotFound desc = could not find container \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": container with ID starting with dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.875568 4841 scope.go:117] "RemoveContainer" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.876528 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} err="failed to get container status \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": rpc error: code = NotFound desc = could not find container \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": container with ID starting with 9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.876571 4841 scope.go:117] "RemoveContainer" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.877123 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} err="failed to get container status \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": rpc error: code = NotFound desc = could not find container \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": container with ID starting with 65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.877161 4841 scope.go:117] "RemoveContainer" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.877595 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} err="failed to get container status \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": rpc error: code = NotFound desc = could not find container \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": container with ID starting with 752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.877637 4841 scope.go:117] "RemoveContainer" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.878273 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} err="failed to get container status \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": rpc error: code = NotFound desc = could not find container \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": container with ID starting with 9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.878314 4841 scope.go:117] "RemoveContainer" containerID="099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.878712 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} err="failed to get container status \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": rpc error: code = NotFound desc = could not find container \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": container with ID starting with 099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.878749 4841 scope.go:117] "RemoveContainer" containerID="4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.885277 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} err="failed to get container status \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": rpc error: code = NotFound desc = could not find container \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": container with ID starting with 4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.885324 4841 scope.go:117] "RemoveContainer" containerID="11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.886023 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"} err="failed to get container status \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": rpc error: code = NotFound desc = could not find container \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": container with ID starting with 11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.886063 4841 scope.go:117] "RemoveContainer" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.886775 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} err="failed to get container status \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": rpc error: code = NotFound desc = could not find container \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": container with ID starting with b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.886857 4841 scope.go:117] "RemoveContainer" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.887601 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} err="failed to get container status \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": rpc error: code = NotFound desc = could not find container \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": container with ID starting with dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.887667 4841 scope.go:117] "RemoveContainer" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.888660 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} err="failed to get container status \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": rpc error: code = NotFound desc = could not find container \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": container with ID starting with 9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.888700 4841 scope.go:117] "RemoveContainer" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.889680 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} err="failed to get container status \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": rpc error: code = NotFound desc = could not find container \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": container with ID starting with 65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.889765 4841 scope.go:117] "RemoveContainer" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.890879 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} err="failed to get container status \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": rpc error: code = NotFound desc = could not find container \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": container with ID starting with 752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.890915 4841 scope.go:117] "RemoveContainer" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.896935 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} err="failed to get container status \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": rpc error: code = NotFound desc = could not find container \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": container with ID starting with 9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.896968 4841 scope.go:117] "RemoveContainer" containerID="099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.897880 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} err="failed to get container status \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": rpc error: code = NotFound desc = could not find container \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": container with ID starting with 099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.897912 4841 scope.go:117] "RemoveContainer" containerID="4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.898335 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} err="failed to get container status \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": rpc error: code = NotFound desc = could not find container \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": container with ID starting with 4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.898362 4841 scope.go:117] "RemoveContainer" containerID="11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.899164 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"} err="failed to get container status \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": rpc error: code = NotFound desc = could not find container \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": container with ID starting with 11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.899536 4841 scope.go:117] "RemoveContainer" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.900713 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} err="failed to get container status \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": rpc error: code = NotFound desc = could not find container \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": container with ID starting with b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.900744 4841 scope.go:117] "RemoveContainer" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.901276 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} err="failed to get container status \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": rpc error: code = NotFound desc = could not find container \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": container with ID starting with dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.901336 4841 scope.go:117] "RemoveContainer" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.902138 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} err="failed to get container status \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": rpc error: code = NotFound desc = could not find container \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": container with ID starting with 9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.902201 4841 scope.go:117] "RemoveContainer" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.902738 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} err="failed to get container status \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": rpc error: code = NotFound desc = could not find container \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": container with ID starting with 65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.902796 4841 scope.go:117] "RemoveContainer" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.903304 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} err="failed to get container status \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": rpc error: code = NotFound desc = could not find container \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": container with ID starting with 752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.903346 4841 scope.go:117] "RemoveContainer" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.903802 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} err="failed to get container status \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": rpc error: code = NotFound desc = could not find container \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": container with ID starting with 9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.903846 4841 scope.go:117] "RemoveContainer" containerID="099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.904378 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676"} err="failed to get container status \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": rpc error: code = NotFound desc = could not find container \"099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676\": container with ID starting with 099c2aa001e7321cf5cd936072bf04641d23269a2c8469068d3655d138af0676 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.904471 4841 scope.go:117] "RemoveContainer" containerID="4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.904963 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1"} err="failed to get container status \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": rpc error: code = NotFound desc = could not find container \"4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1\": container with ID starting with 4d077e6def81ff85c46a6d2a7210ec7b3b9e47cd20c86eca750a384481d572e1 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.904995 4841 scope.go:117] "RemoveContainer" containerID="11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.905458 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a"} err="failed to get container status \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": rpc error: code = NotFound desc = could not find container \"11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a\": container with ID starting with 11c42484d1b12a9e46297f6af13343fda179a3f6aaaced7bee96ae9c071f430a not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.905497 4841 scope.go:117] "RemoveContainer" containerID="b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.905926 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47"} err="failed to get container status \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": rpc error: code = NotFound desc = could not find container \"b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47\": container with ID starting with b0dee8b68bd043dcebe8700a7fbcbd5eb899fbc5b44fd3175980bda9bd9d6d47 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.905959 4841 scope.go:117] "RemoveContainer" containerID="dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.906941 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86"} err="failed to get container status \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": rpc error: code = NotFound desc = could not find container \"dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86\": container with ID starting with dc9953513daab54fffaedcd0a21936457f861830a5b2e1c78e4f422ec4557e86 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.906985 4841 scope.go:117] "RemoveContainer" containerID="9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.907558 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026"} err="failed to get container status \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": rpc error: code = NotFound desc = could not find container \"9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026\": container with ID starting with 9d7242533d7516e81d9b95f660bae80e3b88fce9ca42f0261671fac1ac027026 not found: ID does not exist"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.907605 4841 scope.go:117] "RemoveContainer" containerID="65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"
Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.909039 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5"} err="failed to get container status \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": rpc error: code = NotFound desc = could not find container \"65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5\": container with ID starting with 65e903811e43e383948e387c0697657b7fa8a4bdaf6170611aaec01d93a0daf5 not found: ID does not
exist" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.909079 4841 scope.go:117] "RemoveContainer" containerID="752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.909512 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71"} err="failed to get container status \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": rpc error: code = NotFound desc = could not find container \"752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71\": container with ID starting with 752c81820de5cb447201db09b3edd1db992d6a52270c8dae79dc1c8846b01f71 not found: ID does not exist" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.909545 4841 scope.go:117] "RemoveContainer" containerID="9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15" Jan 30 05:18:31 crc kubenswrapper[4841]: I0130 05:18:31.909951 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15"} err="failed to get container status \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": rpc error: code = NotFound desc = could not find container \"9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15\": container with ID starting with 9f8810aac81e4297a765d625a1a57aef8c0cbda1396e9900bdd41ba63c609e15 not found: ID does not exist" Jan 30 05:18:32 crc kubenswrapper[4841]: I0130 05:18:32.444939 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5d8664-d53a-4e96-9458-fd915cec77b5" path="/var/lib/kubelet/pods/8f5d8664-d53a-4e96-9458-fd915cec77b5/volumes" Jan 30 05:18:32 crc kubenswrapper[4841]: I0130 05:18:32.550026 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-c49cw_262e0db9-4560-4557-823d-8a4145e03fd1/kube-multus/0.log" Jan 30 05:18:32 crc kubenswrapper[4841]: I0130 05:18:32.550191 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c49cw" event={"ID":"262e0db9-4560-4557-823d-8a4145e03fd1","Type":"ContainerStarted","Data":"4d256f556052195985c682603cd6a96c5c570991705d1fd0ab9defdfbea09fe4"} Jan 30 05:18:32 crc kubenswrapper[4841]: I0130 05:18:32.553537 4841 generic.go:334] "Generic (PLEG): container finished" podID="a2574241-5ed8-4957-baab-16031cf6e340" containerID="b13b9d6d67a2eb6151200258cca1d66739c5fe38bff670b828395396ae24e1f8" exitCode=0 Jan 30 05:18:32 crc kubenswrapper[4841]: I0130 05:18:32.553599 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerDied","Data":"b13b9d6d67a2eb6151200258cca1d66739c5fe38bff670b828395396ae24e1f8"} Jan 30 05:18:32 crc kubenswrapper[4841]: I0130 05:18:32.553774 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"8a256fc6c0193d11897b7f9816532412f534fd8fb933761cdb023e3a1a9f2af2"} Jan 30 05:18:33 crc kubenswrapper[4841]: I0130 05:18:33.560949 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"f73ac965d962d32a596d643346fd05c61c30cda4edfdc2ca37b852f38ef8e216"} Jan 30 05:18:33 crc kubenswrapper[4841]: I0130 05:18:33.561506 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"3858cd55df23fcb4906f8a3b9b43822adfde7938b131fe4b5b3dd1fea68c1c8c"} Jan 30 05:18:33 crc kubenswrapper[4841]: I0130 
05:18:33.561519 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"c394cfef334f58f9da63fac10710fb5e6d246452de6ff617988d8d19d26ed676"} Jan 30 05:18:33 crc kubenswrapper[4841]: I0130 05:18:33.561530 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"505ec9476434de748f42839833bca68c953a41b052cff39e0cc8a055f58cba49"} Jan 30 05:18:33 crc kubenswrapper[4841]: I0130 05:18:33.561545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"9c59c7c2f31f931fcdb5ac3020a2a4f69320a08718332afc4658755667c09c59"} Jan 30 05:18:33 crc kubenswrapper[4841]: I0130 05:18:33.561559 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"839a39226c6475fe051b4cc9d5080c1c3b29fd2673025f065e878a5d66bcc374"} Jan 30 05:18:36 crc kubenswrapper[4841]: I0130 05:18:36.594650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"7c32835b95d26cfb935beaf05ef72a0ee04c5d0ba9aa009787e49f6116eebb93"} Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.533175 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-j6z2x"] Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.534232 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.537388 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.537555 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.537669 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.539022 4841 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pw6w2" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.622172 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5f6\" (UniqueName: \"kubernetes.io/projected/ced3c1ba-5caa-433f-b303-3513827bbb37-kube-api-access-tv5f6\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.622354 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ced3c1ba-5caa-433f-b303-3513827bbb37-node-mnt\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.622436 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ced3c1ba-5caa-433f-b303-3513827bbb37-crc-storage\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.723758 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv5f6\" (UniqueName: \"kubernetes.io/projected/ced3c1ba-5caa-433f-b303-3513827bbb37-kube-api-access-tv5f6\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.723939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ced3c1ba-5caa-433f-b303-3513827bbb37-node-mnt\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.723996 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ced3c1ba-5caa-433f-b303-3513827bbb37-crc-storage\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.725607 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ced3c1ba-5caa-433f-b303-3513827bbb37-crc-storage\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.725987 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ced3c1ba-5caa-433f-b303-3513827bbb37-node-mnt\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.766256 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv5f6\" (UniqueName: 
\"kubernetes.io/projected/ced3c1ba-5caa-433f-b303-3513827bbb37-kube-api-access-tv5f6\") pod \"crc-storage-crc-j6z2x\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: I0130 05:18:37.862184 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: E0130 05:18:37.898861 4841 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(add6b031784892ec193e25f9347c82e913191fd7311de9141b6877e4ba506c3b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 05:18:37 crc kubenswrapper[4841]: E0130 05:18:37.898931 4841 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(add6b031784892ec193e25f9347c82e913191fd7311de9141b6877e4ba506c3b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: E0130 05:18:37.898959 4841 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(add6b031784892ec193e25f9347c82e913191fd7311de9141b6877e4ba506c3b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:37 crc kubenswrapper[4841]: E0130 05:18:37.899028 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-j6z2x_crc-storage(ced3c1ba-5caa-433f-b303-3513827bbb37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-j6z2x_crc-storage(ced3c1ba-5caa-433f-b303-3513827bbb37)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(add6b031784892ec193e25f9347c82e913191fd7311de9141b6877e4ba506c3b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-j6z2x" podUID="ced3c1ba-5caa-433f-b303-3513827bbb37" Jan 30 05:18:38 crc kubenswrapper[4841]: I0130 05:18:38.615751 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" event={"ID":"a2574241-5ed8-4957-baab-16031cf6e340","Type":"ContainerStarted","Data":"ee553995aea631c031a0a5f2236cb3cdb3b19d23ad2bcbf30802f7aed56ccb94"} Jan 30 05:18:38 crc kubenswrapper[4841]: I0130 05:18:38.616450 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:38 crc kubenswrapper[4841]: I0130 05:18:38.616569 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:38 crc kubenswrapper[4841]: I0130 05:18:38.616650 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:38 crc kubenswrapper[4841]: I0130 05:18:38.661676 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:38 crc kubenswrapper[4841]: I0130 05:18:38.664575 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:18:38 crc kubenswrapper[4841]: I0130 05:18:38.670925 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" podStartSLOduration=7.670906125 podStartE2EDuration="7.670906125s" podCreationTimestamp="2026-01-30 05:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:18:38.665279435 +0000 UTC m=+655.658752073" watchObservedRunningTime="2026-01-30 05:18:38.670906125 +0000 UTC m=+655.664378763" Jan 30 05:18:39 crc kubenswrapper[4841]: I0130 05:18:39.436722 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-j6z2x"] Jan 30 05:18:39 crc kubenswrapper[4841]: I0130 05:18:39.436826 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:39 crc kubenswrapper[4841]: I0130 05:18:39.437237 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:39 crc kubenswrapper[4841]: E0130 05:18:39.467547 4841 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(4ca30ba58346b99fa5a36442421de7b66f90290ac3bc46f3ef6314efca4de3d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 05:18:39 crc kubenswrapper[4841]: E0130 05:18:39.467636 4841 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(4ca30ba58346b99fa5a36442421de7b66f90290ac3bc46f3ef6314efca4de3d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:39 crc kubenswrapper[4841]: E0130 05:18:39.467680 4841 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(4ca30ba58346b99fa5a36442421de7b66f90290ac3bc46f3ef6314efca4de3d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:39 crc kubenswrapper[4841]: E0130 05:18:39.467751 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-j6z2x_crc-storage(ced3c1ba-5caa-433f-b303-3513827bbb37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-j6z2x_crc-storage(ced3c1ba-5caa-433f-b303-3513827bbb37)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-j6z2x_crc-storage_ced3c1ba-5caa-433f-b303-3513827bbb37_0(4ca30ba58346b99fa5a36442421de7b66f90290ac3bc46f3ef6314efca4de3d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-j6z2x" podUID="ced3c1ba-5caa-433f-b303-3513827bbb37" Jan 30 05:18:51 crc kubenswrapper[4841]: I0130 05:18:51.288092 4841 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 05:18:52 crc kubenswrapper[4841]: I0130 05:18:52.433625 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:52 crc kubenswrapper[4841]: I0130 05:18:52.434601 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:52 crc kubenswrapper[4841]: I0130 05:18:52.707469 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-j6z2x"] Jan 30 05:18:52 crc kubenswrapper[4841]: I0130 05:18:52.714468 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:18:53 crc kubenswrapper[4841]: I0130 05:18:53.720121 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j6z2x" event={"ID":"ced3c1ba-5caa-433f-b303-3513827bbb37","Type":"ContainerStarted","Data":"fd81ea1d73f87e2fae619c0ff85b023691b55b0a3a9b2db0594e64c8c5908f81"} Jan 30 05:18:54 crc kubenswrapper[4841]: I0130 05:18:54.731664 4841 generic.go:334] "Generic (PLEG): container finished" podID="ced3c1ba-5caa-433f-b303-3513827bbb37" containerID="08f903c700f19c8be6dcb8a9d2516ee34a4d54ef1ac4910e363afcd5175145d3" exitCode=0 Jan 30 05:18:54 crc kubenswrapper[4841]: I0130 05:18:54.731792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j6z2x" event={"ID":"ced3c1ba-5caa-433f-b303-3513827bbb37","Type":"ContainerDied","Data":"08f903c700f19c8be6dcb8a9d2516ee34a4d54ef1ac4910e363afcd5175145d3"} Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.071248 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.088625 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv5f6\" (UniqueName: \"kubernetes.io/projected/ced3c1ba-5caa-433f-b303-3513827bbb37-kube-api-access-tv5f6\") pod \"ced3c1ba-5caa-433f-b303-3513827bbb37\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.088820 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ced3c1ba-5caa-433f-b303-3513827bbb37-crc-storage\") pod \"ced3c1ba-5caa-433f-b303-3513827bbb37\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.088855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ced3c1ba-5caa-433f-b303-3513827bbb37-node-mnt\") pod \"ced3c1ba-5caa-433f-b303-3513827bbb37\" (UID: \"ced3c1ba-5caa-433f-b303-3513827bbb37\") " Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.089166 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ced3c1ba-5caa-433f-b303-3513827bbb37-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "ced3c1ba-5caa-433f-b303-3513827bbb37" (UID: "ced3c1ba-5caa-433f-b303-3513827bbb37"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.134637 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced3c1ba-5caa-433f-b303-3513827bbb37-kube-api-access-tv5f6" (OuterVolumeSpecName: "kube-api-access-tv5f6") pod "ced3c1ba-5caa-433f-b303-3513827bbb37" (UID: "ced3c1ba-5caa-433f-b303-3513827bbb37"). InnerVolumeSpecName "kube-api-access-tv5f6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.135566 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced3c1ba-5caa-433f-b303-3513827bbb37-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "ced3c1ba-5caa-433f-b303-3513827bbb37" (UID: "ced3c1ba-5caa-433f-b303-3513827bbb37"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.190040 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv5f6\" (UniqueName: \"kubernetes.io/projected/ced3c1ba-5caa-433f-b303-3513827bbb37-kube-api-access-tv5f6\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.190082 4841 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ced3c1ba-5caa-433f-b303-3513827bbb37-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.190107 4841 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ced3c1ba-5caa-433f-b303-3513827bbb37-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.747460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-j6z2x" event={"ID":"ced3c1ba-5caa-433f-b303-3513827bbb37","Type":"ContainerDied","Data":"fd81ea1d73f87e2fae619c0ff85b023691b55b0a3a9b2db0594e64c8c5908f81"} Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.747497 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd81ea1d73f87e2fae619c0ff85b023691b55b0a3a9b2db0594e64c8c5908f81" Jan 30 05:18:56 crc kubenswrapper[4841]: I0130 05:18:56.747597 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-j6z2x" Jan 30 05:19:01 crc kubenswrapper[4841]: I0130 05:19:01.569931 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mq6pb" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.173890 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"] Jan 30 05:19:04 crc kubenswrapper[4841]: E0130 05:19:04.175351 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced3c1ba-5caa-433f-b303-3513827bbb37" containerName="storage" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.175538 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced3c1ba-5caa-433f-b303-3513827bbb37" containerName="storage" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.175818 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced3c1ba-5caa-433f-b303-3513827bbb37" containerName="storage" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.177125 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.179980 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.193771 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"] Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.301913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.302002 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.302480 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfnw\" (UniqueName: \"kubernetes.io/projected/57724098-c506-430f-977a-11a306b6044c-kube-api-access-ltfnw\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" Jan 30 05:19:04 crc kubenswrapper[4841]: 
I0130 05:19:04.403505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltfnw\" (UniqueName: \"kubernetes.io/projected/57724098-c506-430f-977a-11a306b6044c-kube-api-access-ltfnw\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.403847 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.403875 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.404383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.404650 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.438087 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltfnw\" (UniqueName: \"kubernetes.io/projected/57724098-c506-430f-977a-11a306b6044c-kube-api-access-ltfnw\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.509226 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:04 crc kubenswrapper[4841]: I0130 05:19:04.807495 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"]
Jan 30 05:19:05 crc kubenswrapper[4841]: I0130 05:19:05.808879 4841 generic.go:334] "Generic (PLEG): container finished" podID="57724098-c506-430f-977a-11a306b6044c" containerID="e94845c88db74afc952dedaa52066a3b9a79afc703bd70a71d15d5fba64c30fb" exitCode=0
Jan 30 05:19:05 crc kubenswrapper[4841]: I0130 05:19:05.808945 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" event={"ID":"57724098-c506-430f-977a-11a306b6044c","Type":"ContainerDied","Data":"e94845c88db74afc952dedaa52066a3b9a79afc703bd70a71d15d5fba64c30fb"}
Jan 30 05:19:05 crc kubenswrapper[4841]: I0130 05:19:05.808986 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" event={"ID":"57724098-c506-430f-977a-11a306b6044c","Type":"ContainerStarted","Data":"c586cc71a924082933ffcd8c28d2857e5e9073b3dbd9f4e3920ed760bbe718e7"}
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.555557 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t2625"]
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.565615 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.573494 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2625"]
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.659256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f45sv\" (UniqueName: \"kubernetes.io/projected/834476c9-bb58-49b3-80e2-6f3c5bac4238-kube-api-access-f45sv\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.659313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-catalog-content\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.659371 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-utilities\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.760196 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f45sv\" (UniqueName: \"kubernetes.io/projected/834476c9-bb58-49b3-80e2-6f3c5bac4238-kube-api-access-f45sv\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.760249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-catalog-content\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.760313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-utilities\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.760918 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-utilities\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.761111 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-catalog-content\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.784064 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f45sv\" (UniqueName: \"kubernetes.io/projected/834476c9-bb58-49b3-80e2-6f3c5bac4238-kube-api-access-f45sv\") pod \"redhat-operators-t2625\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:06 crc kubenswrapper[4841]: I0130 05:19:06.902801 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:07 crc kubenswrapper[4841]: I0130 05:19:07.133887 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2625"]
Jan 30 05:19:07 crc kubenswrapper[4841]: W0130 05:19:07.138181 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod834476c9_bb58_49b3_80e2_6f3c5bac4238.slice/crio-d7daa9c4e291e2318c4a6f6ac402bf3711c611061f7d5c041818e502a780d707 WatchSource:0}: Error finding container d7daa9c4e291e2318c4a6f6ac402bf3711c611061f7d5c041818e502a780d707: Status 404 returned error can't find the container with id d7daa9c4e291e2318c4a6f6ac402bf3711c611061f7d5c041818e502a780d707
Jan 30 05:19:07 crc kubenswrapper[4841]: I0130 05:19:07.822763 4841 generic.go:334] "Generic (PLEG): container finished" podID="57724098-c506-430f-977a-11a306b6044c" containerID="4ca19165b85dd604ee68d482debe431223ae753def44e5906a4631c0e890b0ab" exitCode=0
Jan 30 05:19:07 crc kubenswrapper[4841]: I0130 05:19:07.822899 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" event={"ID":"57724098-c506-430f-977a-11a306b6044c","Type":"ContainerDied","Data":"4ca19165b85dd604ee68d482debe431223ae753def44e5906a4631c0e890b0ab"}
Jan 30 05:19:07 crc kubenswrapper[4841]: I0130 05:19:07.824124 4841 generic.go:334] "Generic (PLEG): container finished" podID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerID="f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160" exitCode=0
Jan 30 05:19:07 crc kubenswrapper[4841]: I0130 05:19:07.824159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2625" event={"ID":"834476c9-bb58-49b3-80e2-6f3c5bac4238","Type":"ContainerDied","Data":"f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160"}
Jan 30 05:19:07 crc kubenswrapper[4841]: I0130 05:19:07.824207 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2625" event={"ID":"834476c9-bb58-49b3-80e2-6f3c5bac4238","Type":"ContainerStarted","Data":"d7daa9c4e291e2318c4a6f6ac402bf3711c611061f7d5c041818e502a780d707"}
Jan 30 05:19:08 crc kubenswrapper[4841]: I0130 05:19:08.835200 4841 generic.go:334] "Generic (PLEG): container finished" podID="57724098-c506-430f-977a-11a306b6044c" containerID="e5eb627288c6672042e5670bbfd8bbfb8ef36ec855d0cf82a3d226df1a4e0cd2" exitCode=0
Jan 30 05:19:08 crc kubenswrapper[4841]: I0130 05:19:08.835333 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" event={"ID":"57724098-c506-430f-977a-11a306b6044c","Type":"ContainerDied","Data":"e5eb627288c6672042e5670bbfd8bbfb8ef36ec855d0cf82a3d226df1a4e0cd2"}
Jan 30 05:19:08 crc kubenswrapper[4841]: I0130 05:19:08.839267 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2625" event={"ID":"834476c9-bb58-49b3-80e2-6f3c5bac4238","Type":"ContainerStarted","Data":"3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa"}
Jan 30 05:19:09 crc kubenswrapper[4841]: I0130 05:19:09.848914 4841 generic.go:334] "Generic (PLEG): container finished" podID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerID="3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa" exitCode=0
Jan 30 05:19:09 crc kubenswrapper[4841]: I0130 05:19:09.849197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2625" event={"ID":"834476c9-bb58-49b3-80e2-6f3c5bac4238","Type":"ContainerDied","Data":"3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa"}
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.213251 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.414008 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-bundle\") pod \"57724098-c506-430f-977a-11a306b6044c\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") "
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.414671 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltfnw\" (UniqueName: \"kubernetes.io/projected/57724098-c506-430f-977a-11a306b6044c-kube-api-access-ltfnw\") pod \"57724098-c506-430f-977a-11a306b6044c\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") "
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.414812 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-util\") pod \"57724098-c506-430f-977a-11a306b6044c\" (UID: \"57724098-c506-430f-977a-11a306b6044c\") "
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.415279 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-bundle" (OuterVolumeSpecName: "bundle") pod "57724098-c506-430f-977a-11a306b6044c" (UID: "57724098-c506-430f-977a-11a306b6044c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.435707 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57724098-c506-430f-977a-11a306b6044c-kube-api-access-ltfnw" (OuterVolumeSpecName: "kube-api-access-ltfnw") pod "57724098-c506-430f-977a-11a306b6044c" (UID: "57724098-c506-430f-977a-11a306b6044c"). InnerVolumeSpecName "kube-api-access-ltfnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.489997 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-util" (OuterVolumeSpecName: "util") pod "57724098-c506-430f-977a-11a306b6044c" (UID: "57724098-c506-430f-977a-11a306b6044c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.515974 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltfnw\" (UniqueName: \"kubernetes.io/projected/57724098-c506-430f-977a-11a306b6044c-kube-api-access-ltfnw\") on node \"crc\" DevicePath \"\""
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.516025 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-util\") on node \"crc\" DevicePath \"\""
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.516048 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57724098-c506-430f-977a-11a306b6044c-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.859040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2625" event={"ID":"834476c9-bb58-49b3-80e2-6f3c5bac4238","Type":"ContainerStarted","Data":"4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8"}
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.862982 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn" event={"ID":"57724098-c506-430f-977a-11a306b6044c","Type":"ContainerDied","Data":"c586cc71a924082933ffcd8c28d2857e5e9073b3dbd9f4e3920ed760bbe718e7"}
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.863036 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c586cc71a924082933ffcd8c28d2857e5e9073b3dbd9f4e3920ed760bbe718e7"
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.863208 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn"
Jan 30 05:19:10 crc kubenswrapper[4841]: I0130 05:19:10.896826 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t2625" podStartSLOduration=2.427325562 podStartE2EDuration="4.896801249s" podCreationTimestamp="2026-01-30 05:19:06 +0000 UTC" firstStartedPulling="2026-01-30 05:19:07.825447156 +0000 UTC m=+684.818919804" lastFinishedPulling="2026-01-30 05:19:10.294922843 +0000 UTC m=+687.288395491" observedRunningTime="2026-01-30 05:19:10.890657325 +0000 UTC m=+687.884130023" watchObservedRunningTime="2026-01-30 05:19:10.896801249 +0000 UTC m=+687.890273917"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.454178 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-gkcjz"]
Jan 30 05:19:14 crc kubenswrapper[4841]: E0130 05:19:14.454779 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57724098-c506-430f-977a-11a306b6044c" containerName="pull"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.454797 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="57724098-c506-430f-977a-11a306b6044c" containerName="pull"
Jan 30 05:19:14 crc kubenswrapper[4841]: E0130 05:19:14.454826 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57724098-c506-430f-977a-11a306b6044c" containerName="util"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.454837 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="57724098-c506-430f-977a-11a306b6044c" containerName="util"
Jan 30 05:19:14 crc kubenswrapper[4841]: E0130 05:19:14.454855 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57724098-c506-430f-977a-11a306b6044c" containerName="extract"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.454865 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="57724098-c506-430f-977a-11a306b6044c" containerName="extract"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.454985 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="57724098-c506-430f-977a-11a306b6044c" containerName="extract"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.455445 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.457470 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2scs5"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.458954 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.460930 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.469078 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv492\" (UniqueName: \"kubernetes.io/projected/9e2f7f43-61e4-481f-8015-170a5af14054-kube-api-access-vv492\") pod \"nmstate-operator-646758c888-gkcjz\" (UID: \"9e2f7f43-61e4-481f-8015-170a5af14054\") " pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.469442 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-gkcjz"]
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.570215 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv492\" (UniqueName: \"kubernetes.io/projected/9e2f7f43-61e4-481f-8015-170a5af14054-kube-api-access-vv492\") pod \"nmstate-operator-646758c888-gkcjz\" (UID: \"9e2f7f43-61e4-481f-8015-170a5af14054\") " pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.590700 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv492\" (UniqueName: \"kubernetes.io/projected/9e2f7f43-61e4-481f-8015-170a5af14054-kube-api-access-vv492\") pod \"nmstate-operator-646758c888-gkcjz\" (UID: \"9e2f7f43-61e4-481f-8015-170a5af14054\") " pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz"
Jan 30 05:19:14 crc kubenswrapper[4841]: I0130 05:19:14.775552 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz"
Jan 30 05:19:15 crc kubenswrapper[4841]: I0130 05:19:15.260798 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-gkcjz"]
Jan 30 05:19:15 crc kubenswrapper[4841]: W0130 05:19:15.265391 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e2f7f43_61e4_481f_8015_170a5af14054.slice/crio-8102161b780387c9361522ee10e956f407393f6af9b064eef293d4f651b3d6d3 WatchSource:0}: Error finding container 8102161b780387c9361522ee10e956f407393f6af9b064eef293d4f651b3d6d3: Status 404 returned error can't find the container with id 8102161b780387c9361522ee10e956f407393f6af9b064eef293d4f651b3d6d3
Jan 30 05:19:15 crc kubenswrapper[4841]: I0130 05:19:15.904770 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz" event={"ID":"9e2f7f43-61e4-481f-8015-170a5af14054","Type":"ContainerStarted","Data":"8102161b780387c9361522ee10e956f407393f6af9b064eef293d4f651b3d6d3"}
Jan 30 05:19:16 crc kubenswrapper[4841]: I0130 05:19:16.909074 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:16 crc kubenswrapper[4841]: I0130 05:19:16.909952 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t2625"
Jan 30 05:19:17 crc kubenswrapper[4841]: I0130 05:19:17.918645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz" event={"ID":"9e2f7f43-61e4-481f-8015-170a5af14054","Type":"ContainerStarted","Data":"67f84c564d8f35ad21d24f09a460649d9f490e145319f9ac9836141907188f11"}
Jan 30 05:19:17 crc kubenswrapper[4841]: I0130 05:19:17.944327 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-gkcjz" podStartSLOduration=1.584065303 podStartE2EDuration="3.944305203s" podCreationTimestamp="2026-01-30 05:19:14 +0000 UTC" firstStartedPulling="2026-01-30 05:19:15.268027883 +0000 UTC m=+692.261500531" lastFinishedPulling="2026-01-30 05:19:17.628267783 +0000 UTC m=+694.621740431" observedRunningTime="2026-01-30 05:19:17.941874578 +0000 UTC m=+694.935347256" watchObservedRunningTime="2026-01-30 05:19:17.944305203 +0000 UTC m=+694.937777871"
Jan 30 05:19:17 crc kubenswrapper[4841]: I0130 05:19:17.966641 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t2625" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:19:17 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:19:17 crc kubenswrapper[4841]: >
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.359835 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.362374 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.364004 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.368247 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2hgk4"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.381805 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.383489 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.387466 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.412784 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.425572 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-766rn"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.427126 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.499769 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.500903 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.502707 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-m22zz"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.503184 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.503471 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.507922 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539454 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e802f3fc-3992-4900-b7db-8fc0938a3433-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cc8g\" (UniqueName: \"kubernetes.io/projected/b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5-kube-api-access-7cc8g\") pod \"nmstate-metrics-54757c584b-bmmlb\" (UID: \"b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539522 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-dbus-socket\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539543 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6w4m\" (UniqueName: \"kubernetes.io/projected/e802f3fc-3992-4900-b7db-8fc0938a3433-kube-api-access-v6w4m\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539560 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gnq4\" (UniqueName: \"kubernetes.io/projected/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-kube-api-access-2gnq4\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42nbk\" (UniqueName: \"kubernetes.io/projected/c0d1633d-39be-4f82-a1b8-472d8c578b0c-kube-api-access-42nbk\") pod \"nmstate-webhook-8474b5b9d8-7jj7v\" (UID: \"c0d1633d-39be-4f82-a1b8-472d8c578b0c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539600 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e802f3fc-3992-4900-b7db-8fc0938a3433-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539673 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c0d1633d-39be-4f82-a1b8-472d8c578b0c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7jj7v\" (UID: \"c0d1633d-39be-4f82-a1b8-472d8c578b0c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539693 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-ovs-socket\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.539725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-nmstate-lock\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640314 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-nmstate-lock\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640371 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e802f3fc-3992-4900-b7db-8fc0938a3433-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640389 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cc8g\" (UniqueName: \"kubernetes.io/projected/b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5-kube-api-access-7cc8g\") pod \"nmstate-metrics-54757c584b-bmmlb\" (UID: \"b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640422 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-dbus-socket\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640439 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6w4m\" (UniqueName: \"kubernetes.io/projected/e802f3fc-3992-4900-b7db-8fc0938a3433-kube-api-access-v6w4m\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640453 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gnq4\" (UniqueName: \"kubernetes.io/projected/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-kube-api-access-2gnq4\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640472 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42nbk\" (UniqueName: \"kubernetes.io/projected/c0d1633d-39be-4f82-a1b8-472d8c578b0c-kube-api-access-42nbk\") pod \"nmstate-webhook-8474b5b9d8-7jj7v\" (UID: \"c0d1633d-39be-4f82-a1b8-472d8c578b0c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640493 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e802f3fc-3992-4900-b7db-8fc0938a3433-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640524 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c0d1633d-39be-4f82-a1b8-472d8c578b0c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7jj7v\" (UID: \"c0d1633d-39be-4f82-a1b8-472d8c578b0c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640542 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-ovs-socket\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640639 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-ovs-socket\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.640677 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-nmstate-lock\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.641841 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-dbus-socket\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.642989 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e802f3fc-3992-4900-b7db-8fc0938a3433-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.655369 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c0d1633d-39be-4f82-a1b8-472d8c578b0c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7jj7v\" (UID: \"c0d1633d-39be-4f82-a1b8-472d8c578b0c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.657961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e802f3fc-3992-4900-b7db-8fc0938a3433-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.660040 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cc8g\" (UniqueName: \"kubernetes.io/projected/b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5-kube-api-access-7cc8g\") pod \"nmstate-metrics-54757c584b-bmmlb\" (UID: \"b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.660747 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6w4m\" (UniqueName: \"kubernetes.io/projected/e802f3fc-3992-4900-b7db-8fc0938a3433-kube-api-access-v6w4m\") pod \"nmstate-console-plugin-7754f76f8b-mnmq2\" (UID: \"e802f3fc-3992-4900-b7db-8fc0938a3433\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.662546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gnq4\" (UniqueName: \"kubernetes.io/projected/1eb9465f-7705-4888-ab88-9da6eb8d9f5b-kube-api-access-2gnq4\") pod \"nmstate-handler-766rn\" (UID: \"1eb9465f-7705-4888-ab88-9da6eb8d9f5b\") " pod="openshift-nmstate/nmstate-handler-766rn"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.669576 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42nbk\" (UniqueName: \"kubernetes.io/projected/c0d1633d-39be-4f82-a1b8-472d8c578b0c-kube-api-access-42nbk\") pod \"nmstate-webhook-8474b5b9d8-7jj7v\" (UID: \"c0d1633d-39be-4f82-a1b8-472d8c578b0c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.681046 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-67c44bb664-mm5ch"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.681884 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67c44bb664-mm5ch"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.690713 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67c44bb664-mm5ch"]
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.696241 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"
Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.708919 4841 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.742061 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/624b613a-e66f-4683-9317-8e3a73cfc078-console-serving-cert\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.742110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/624b613a-e66f-4683-9317-8e3a73cfc078-console-oauth-config\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.742186 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-trusted-ca-bundle\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.742203 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggnw\" (UniqueName: \"kubernetes.io/projected/624b613a-e66f-4683-9317-8e3a73cfc078-kube-api-access-5ggnw\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.742223 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-service-ca\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.742240 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-oauth-serving-cert\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.742259 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-console-config\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.756934 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-766rn" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.814970 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.844132 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-trusted-ca-bundle\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.844170 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ggnw\" (UniqueName: \"kubernetes.io/projected/624b613a-e66f-4683-9317-8e3a73cfc078-kube-api-access-5ggnw\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.844190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-service-ca\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.844340 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-oauth-serving-cert\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.844434 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-console-config\") pod \"console-67c44bb664-mm5ch\" (UID: 
\"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.844458 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/624b613a-e66f-4683-9317-8e3a73cfc078-console-serving-cert\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.844515 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/624b613a-e66f-4683-9317-8e3a73cfc078-console-oauth-config\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.845473 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-console-config\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.845513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-oauth-serving-cert\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.845651 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-trusted-ca-bundle\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " 
pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.846096 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/624b613a-e66f-4683-9317-8e3a73cfc078-service-ca\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.849674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/624b613a-e66f-4683-9317-8e3a73cfc078-console-oauth-config\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.849701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/624b613a-e66f-4683-9317-8e3a73cfc078-console-serving-cert\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.858267 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggnw\" (UniqueName: \"kubernetes.io/projected/624b613a-e66f-4683-9317-8e3a73cfc078-kube-api-access-5ggnw\") pod \"console-67c44bb664-mm5ch\" (UID: \"624b613a-e66f-4683-9317-8e3a73cfc078\") " pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.891996 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v"] Jan 30 05:19:24 crc kubenswrapper[4841]: W0130 05:19:24.898839 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0d1633d_39be_4f82_a1b8_472d8c578b0c.slice/crio-fb93f2f4082adf884fa15e1e7f5b8092548281268e65ee828dfa9c39659fbbd6 WatchSource:0}: Error finding container fb93f2f4082adf884fa15e1e7f5b8092548281268e65ee828dfa9c39659fbbd6: Status 404 returned error can't find the container with id fb93f2f4082adf884fa15e1e7f5b8092548281268e65ee828dfa9c39659fbbd6 Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.929745 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-bmmlb"] Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.965163 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-766rn" event={"ID":"1eb9465f-7705-4888-ab88-9da6eb8d9f5b","Type":"ContainerStarted","Data":"1bcaad6e219520c735321c4bb1302cdf7be6bb068558bea7a5c9c9a705bd0de7"} Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.968255 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v" event={"ID":"c0d1633d-39be-4f82-a1b8-472d8c578b0c","Type":"ContainerStarted","Data":"fb93f2f4082adf884fa15e1e7f5b8092548281268e65ee828dfa9c39659fbbd6"} Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.969188 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb" event={"ID":"b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5","Type":"ContainerStarted","Data":"46006e5a350053b56f571d959b947f085025c0a2a0d0bae975085e95b07554b8"} Jan 30 05:19:24 crc kubenswrapper[4841]: I0130 05:19:24.990919 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2"] Jan 30 05:19:25 crc kubenswrapper[4841]: I0130 05:19:25.054039 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:25 crc kubenswrapper[4841]: I0130 05:19:25.236243 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67c44bb664-mm5ch"] Jan 30 05:19:25 crc kubenswrapper[4841]: W0130 05:19:25.240750 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod624b613a_e66f_4683_9317_8e3a73cfc078.slice/crio-f748b33b3fefcc2905b17ff02849e30c2128652c40634db372a3023f0ce93a19 WatchSource:0}: Error finding container f748b33b3fefcc2905b17ff02849e30c2128652c40634db372a3023f0ce93a19: Status 404 returned error can't find the container with id f748b33b3fefcc2905b17ff02849e30c2128652c40634db372a3023f0ce93a19 Jan 30 05:19:25 crc kubenswrapper[4841]: I0130 05:19:25.980177 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67c44bb664-mm5ch" event={"ID":"624b613a-e66f-4683-9317-8e3a73cfc078","Type":"ContainerStarted","Data":"0c92df8b41a8aff3208c303515565e71707029c8b6a2ef50645aaee2664ad011"} Jan 30 05:19:25 crc kubenswrapper[4841]: I0130 05:19:25.980253 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67c44bb664-mm5ch" event={"ID":"624b613a-e66f-4683-9317-8e3a73cfc078","Type":"ContainerStarted","Data":"f748b33b3fefcc2905b17ff02849e30c2128652c40634db372a3023f0ce93a19"} Jan 30 05:19:25 crc kubenswrapper[4841]: I0130 05:19:25.981658 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2" event={"ID":"e802f3fc-3992-4900-b7db-8fc0938a3433","Type":"ContainerStarted","Data":"a8c7cad39bd5904276f5c406c0d24d2105939d718452d929fbcdad88c8110735"} Jan 30 05:19:26 crc kubenswrapper[4841]: I0130 05:19:26.022649 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67c44bb664-mm5ch" podStartSLOduration=2.022614211 
podStartE2EDuration="2.022614211s" podCreationTimestamp="2026-01-30 05:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:19:26.008495075 +0000 UTC m=+703.001967783" watchObservedRunningTime="2026-01-30 05:19:26.022614211 +0000 UTC m=+703.016086889" Jan 30 05:19:26 crc kubenswrapper[4841]: I0130 05:19:26.939001 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t2625" Jan 30 05:19:26 crc kubenswrapper[4841]: I0130 05:19:26.990888 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t2625" Jan 30 05:19:27 crc kubenswrapper[4841]: I0130 05:19:27.170116 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2625"] Jan 30 05:19:27 crc kubenswrapper[4841]: I0130 05:19:27.990955 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t2625" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="registry-server" containerID="cri-o://4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8" gracePeriod=2 Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.316677 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t2625" Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.495887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f45sv\" (UniqueName: \"kubernetes.io/projected/834476c9-bb58-49b3-80e2-6f3c5bac4238-kube-api-access-f45sv\") pod \"834476c9-bb58-49b3-80e2-6f3c5bac4238\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.496119 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-catalog-content\") pod \"834476c9-bb58-49b3-80e2-6f3c5bac4238\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.496339 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-utilities\") pod \"834476c9-bb58-49b3-80e2-6f3c5bac4238\" (UID: \"834476c9-bb58-49b3-80e2-6f3c5bac4238\") " Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.497904 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-utilities" (OuterVolumeSpecName: "utilities") pod "834476c9-bb58-49b3-80e2-6f3c5bac4238" (UID: "834476c9-bb58-49b3-80e2-6f3c5bac4238"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.506857 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834476c9-bb58-49b3-80e2-6f3c5bac4238-kube-api-access-f45sv" (OuterVolumeSpecName: "kube-api-access-f45sv") pod "834476c9-bb58-49b3-80e2-6f3c5bac4238" (UID: "834476c9-bb58-49b3-80e2-6f3c5bac4238"). InnerVolumeSpecName "kube-api-access-f45sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.598179 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f45sv\" (UniqueName: \"kubernetes.io/projected/834476c9-bb58-49b3-80e2-6f3c5bac4238-kube-api-access-f45sv\") on node \"crc\" DevicePath \"\"" Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.598221 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.640334 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "834476c9-bb58-49b3-80e2-6f3c5bac4238" (UID: "834476c9-bb58-49b3-80e2-6f3c5bac4238"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:19:28 crc kubenswrapper[4841]: I0130 05:19:28.699058 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834476c9-bb58-49b3-80e2-6f3c5bac4238-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.003257 4841 generic.go:334] "Generic (PLEG): container finished" podID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerID="4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8" exitCode=0 Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.003328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2625" event={"ID":"834476c9-bb58-49b3-80e2-6f3c5bac4238","Type":"ContainerDied","Data":"4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8"} Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.003375 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2625" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.003393 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2625" event={"ID":"834476c9-bb58-49b3-80e2-6f3c5bac4238","Type":"ContainerDied","Data":"d7daa9c4e291e2318c4a6f6ac402bf3711c611061f7d5c041818e502a780d707"} Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.003453 4841 scope.go:117] "RemoveContainer" containerID="4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.008846 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-766rn" event={"ID":"1eb9465f-7705-4888-ab88-9da6eb8d9f5b","Type":"ContainerStarted","Data":"3dc3de27618bf72700e9f5e97f7c859d29f6f2b14566e4dab0ca05c39a59d4a6"} Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.009155 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-766rn" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.013328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v" event={"ID":"c0d1633d-39be-4f82-a1b8-472d8c578b0c","Type":"ContainerStarted","Data":"2648f0ada6e1e6f6d45862f26cefdcd7e7cf4495632d9f1246a51f4db867fe45"} Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.014002 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.018021 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb" event={"ID":"b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5","Type":"ContainerStarted","Data":"c73abb353f759eacf50a4d882ac4512d0e657399a3a43ee989cef4e3c7b7fc38"} Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.020592 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2" event={"ID":"e802f3fc-3992-4900-b7db-8fc0938a3433","Type":"ContainerStarted","Data":"8af2686b58ce4b832ec0108acf7297c26765cb04bbe959eab46f56b666833aeb"} Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.039063 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-766rn" podStartSLOduration=1.92638818 podStartE2EDuration="5.039037592s" podCreationTimestamp="2026-01-30 05:19:24 +0000 UTC" firstStartedPulling="2026-01-30 05:19:24.78025995 +0000 UTC m=+701.773732588" lastFinishedPulling="2026-01-30 05:19:27.892909322 +0000 UTC m=+704.886382000" observedRunningTime="2026-01-30 05:19:29.030470305 +0000 UTC m=+706.023943003" watchObservedRunningTime="2026-01-30 05:19:29.039037592 +0000 UTC m=+706.032510270" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.040471 4841 scope.go:117] "RemoveContainer" containerID="3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.070723 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v" podStartSLOduration=2.079169166 podStartE2EDuration="5.070704915s" podCreationTimestamp="2026-01-30 05:19:24 +0000 UTC" firstStartedPulling="2026-01-30 05:19:24.901969409 +0000 UTC m=+701.895442047" lastFinishedPulling="2026-01-30 05:19:27.893505168 +0000 UTC m=+704.886977796" observedRunningTime="2026-01-30 05:19:29.06903455 +0000 UTC m=+706.062507198" watchObservedRunningTime="2026-01-30 05:19:29.070704915 +0000 UTC m=+706.064177563" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.089318 4841 scope.go:117] "RemoveContainer" containerID="f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.092259 4841 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-mnmq2" podStartSLOduration=2.216440509 podStartE2EDuration="5.092243958s" podCreationTimestamp="2026-01-30 05:19:24 +0000 UTC" firstStartedPulling="2026-01-30 05:19:24.997247885 +0000 UTC m=+701.990720523" lastFinishedPulling="2026-01-30 05:19:27.873051334 +0000 UTC m=+704.866523972" observedRunningTime="2026-01-30 05:19:29.09005019 +0000 UTC m=+706.083522838" watchObservedRunningTime="2026-01-30 05:19:29.092243958 +0000 UTC m=+706.085716616" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.117220 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2625"] Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.120789 4841 scope.go:117] "RemoveContainer" containerID="4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.124159 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t2625"] Jan 30 05:19:29 crc kubenswrapper[4841]: E0130 05:19:29.126842 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8\": container with ID starting with 4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8 not found: ID does not exist" containerID="4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.126888 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8"} err="failed to get container status \"4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8\": rpc error: code = NotFound desc = could not find container \"4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8\": container with ID starting 
with 4e1a3470ecdb8e07f85e484433097aa37d3defd392f74122a335f1757e4015b8 not found: ID does not exist" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.126949 4841 scope.go:117] "RemoveContainer" containerID="3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa" Jan 30 05:19:29 crc kubenswrapper[4841]: E0130 05:19:29.128393 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa\": container with ID starting with 3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa not found: ID does not exist" containerID="3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.128662 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa"} err="failed to get container status \"3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa\": rpc error: code = NotFound desc = could not find container \"3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa\": container with ID starting with 3fe2d112dd9089358170b65b0c85ac8e792ff03656d07eeee6985802a96283fa not found: ID does not exist" Jan 30 05:19:29 crc kubenswrapper[4841]: I0130 05:19:29.128683 4841 scope.go:117] "RemoveContainer" containerID="f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160" Jan 30 05:19:29 crc kubenswrapper[4841]: E0130 05:19:29.129198 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160\": container with ID starting with f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160 not found: ID does not exist" containerID="f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160" Jan 30 05:19:29 
crc kubenswrapper[4841]: I0130 05:19:29.129433 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160"} err="failed to get container status \"f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160\": rpc error: code = NotFound desc = could not find container \"f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160\": container with ID starting with f1cc977da953d560c519d6bbe04d02a662355d6c3353585b576c13e37fbf5160 not found: ID does not exist" Jan 30 05:19:30 crc kubenswrapper[4841]: I0130 05:19:30.442260 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" path="/var/lib/kubelet/pods/834476c9-bb58-49b3-80e2-6f3c5bac4238/volumes" Jan 30 05:19:31 crc kubenswrapper[4841]: I0130 05:19:31.037951 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb" event={"ID":"b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5","Type":"ContainerStarted","Data":"b65b79180a9fadec9c956bfa43c23de39cc6dbac9c5287b6d3f4b8f329aba673"} Jan 30 05:19:31 crc kubenswrapper[4841]: I0130 05:19:31.068243 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-bmmlb" podStartSLOduration=1.151924652 podStartE2EDuration="7.068214482s" podCreationTimestamp="2026-01-30 05:19:24 +0000 UTC" firstStartedPulling="2026-01-30 05:19:24.932304427 +0000 UTC m=+701.925777065" lastFinishedPulling="2026-01-30 05:19:30.848594257 +0000 UTC m=+707.842066895" observedRunningTime="2026-01-30 05:19:31.063011773 +0000 UTC m=+708.056484501" watchObservedRunningTime="2026-01-30 05:19:31.068214482 +0000 UTC m=+708.061687150" Jan 30 05:19:34 crc kubenswrapper[4841]: I0130 05:19:34.791552 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-766rn" Jan 30 05:19:35 crc 
kubenswrapper[4841]: I0130 05:19:35.054949 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:35 crc kubenswrapper[4841]: I0130 05:19:35.055049 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:35 crc kubenswrapper[4841]: I0130 05:19:35.065336 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:35 crc kubenswrapper[4841]: I0130 05:19:35.075971 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67c44bb664-mm5ch" Jan 30 05:19:35 crc kubenswrapper[4841]: I0130 05:19:35.174842 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7zvv6"] Jan 30 05:19:40 crc kubenswrapper[4841]: I0130 05:19:40.463359 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:19:40 crc kubenswrapper[4841]: I0130 05:19:40.464133 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:19:44 crc kubenswrapper[4841]: I0130 05:19:44.717189 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7jj7v" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.226101 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-7zvv6" podUID="68fe97c1-4b26-445b-af5b-73808e119f0b" containerName="console" containerID="cri-o://004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924" gracePeriod=15 Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.663035 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7zvv6_68fe97c1-4b26-445b-af5b-73808e119f0b/console/0.log" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.663385 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.666279 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7zvv6_68fe97c1-4b26-445b-af5b-73808e119f0b/console/0.log" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.666355 4841 generic.go:334] "Generic (PLEG): container finished" podID="68fe97c1-4b26-445b-af5b-73808e119f0b" containerID="004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924" exitCode=2 Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.666428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7zvv6" event={"ID":"68fe97c1-4b26-445b-af5b-73808e119f0b","Type":"ContainerDied","Data":"004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924"} Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.666476 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7zvv6" event={"ID":"68fe97c1-4b26-445b-af5b-73808e119f0b","Type":"ContainerDied","Data":"fefa0a6241043bd13dbe2618ca4ec292ba1ba9cc62d292fd9a8943c7774487e0"} Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.666505 4841 scope.go:117] "RemoveContainer" containerID="004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.702100 4841 
scope.go:117] "RemoveContainer" containerID="004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924" Jan 30 05:20:00 crc kubenswrapper[4841]: E0130 05:20:00.702783 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924\": container with ID starting with 004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924 not found: ID does not exist" containerID="004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.702850 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924"} err="failed to get container status \"004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924\": rpc error: code = NotFound desc = could not find container \"004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924\": container with ID starting with 004954a78fb412a08ad1fc1a442f9375578d029ed15e4274327ac53eb8cb4924 not found: ID does not exist" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.858274 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-service-ca\") pod \"68fe97c1-4b26-445b-af5b-73808e119f0b\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.858360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-console-config\") pod \"68fe97c1-4b26-445b-af5b-73808e119f0b\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.858535 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-oauth-serving-cert\") pod \"68fe97c1-4b26-445b-af5b-73808e119f0b\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.858597 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-serving-cert\") pod \"68fe97c1-4b26-445b-af5b-73808e119f0b\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.858688 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmh4m\" (UniqueName: \"kubernetes.io/projected/68fe97c1-4b26-445b-af5b-73808e119f0b-kube-api-access-gmh4m\") pod \"68fe97c1-4b26-445b-af5b-73808e119f0b\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.859803 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-service-ca" (OuterVolumeSpecName: "service-ca") pod "68fe97c1-4b26-445b-af5b-73808e119f0b" (UID: "68fe97c1-4b26-445b-af5b-73808e119f0b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.859830 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "68fe97c1-4b26-445b-af5b-73808e119f0b" (UID: "68fe97c1-4b26-445b-af5b-73808e119f0b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.860365 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-oauth-config\") pod \"68fe97c1-4b26-445b-af5b-73808e119f0b\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.860472 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-trusted-ca-bundle\") pod \"68fe97c1-4b26-445b-af5b-73808e119f0b\" (UID: \"68fe97c1-4b26-445b-af5b-73808e119f0b\") " Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.860482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-console-config" (OuterVolumeSpecName: "console-config") pod "68fe97c1-4b26-445b-af5b-73808e119f0b" (UID: "68fe97c1-4b26-445b-af5b-73808e119f0b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.860815 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.860844 4841 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.860865 4841 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.861390 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "68fe97c1-4b26-445b-af5b-73808e119f0b" (UID: "68fe97c1-4b26-445b-af5b-73808e119f0b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.867443 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "68fe97c1-4b26-445b-af5b-73808e119f0b" (UID: "68fe97c1-4b26-445b-af5b-73808e119f0b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.868456 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "68fe97c1-4b26-445b-af5b-73808e119f0b" (UID: "68fe97c1-4b26-445b-af5b-73808e119f0b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.868528 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68fe97c1-4b26-445b-af5b-73808e119f0b-kube-api-access-gmh4m" (OuterVolumeSpecName: "kube-api-access-gmh4m") pod "68fe97c1-4b26-445b-af5b-73808e119f0b" (UID: "68fe97c1-4b26-445b-af5b-73808e119f0b"). InnerVolumeSpecName "kube-api-access-gmh4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.962849 4841 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.962900 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmh4m\" (UniqueName: \"kubernetes.io/projected/68fe97c1-4b26-445b-af5b-73808e119f0b-kube-api-access-gmh4m\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.962921 4841 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68fe97c1-4b26-445b-af5b-73808e119f0b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:00 crc kubenswrapper[4841]: I0130 05:20:00.962939 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/68fe97c1-4b26-445b-af5b-73808e119f0b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:01 crc kubenswrapper[4841]: I0130 05:20:01.681270 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7zvv6" Jan 30 05:20:01 crc kubenswrapper[4841]: I0130 05:20:01.725485 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7zvv6"] Jan 30 05:20:01 crc kubenswrapper[4841]: I0130 05:20:01.732182 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7zvv6"] Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.383344 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr"] Jan 30 05:20:02 crc kubenswrapper[4841]: E0130 05:20:02.383615 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="extract-content" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.383633 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="extract-content" Jan 30 05:20:02 crc kubenswrapper[4841]: E0130 05:20:02.383651 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="registry-server" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.383660 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="registry-server" Jan 30 05:20:02 crc kubenswrapper[4841]: E0130 05:20:02.383678 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fe97c1-4b26-445b-af5b-73808e119f0b" containerName="console" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.383686 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="68fe97c1-4b26-445b-af5b-73808e119f0b" containerName="console" Jan 30 05:20:02 crc kubenswrapper[4841]: E0130 05:20:02.383701 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="extract-utilities" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.383706 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="extract-utilities" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.383796 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="834476c9-bb58-49b3-80e2-6f3c5bac4238" containerName="registry-server" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.383807 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fe97c1-4b26-445b-af5b-73808e119f0b" containerName="console" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.384549 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.387724 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.401699 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr"] Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.439433 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68fe97c1-4b26-445b-af5b-73808e119f0b" path="/var/lib/kubelet/pods/68fe97c1-4b26-445b-af5b-73808e119f0b/volumes" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.583892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.584458 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkq8z\" (UniqueName: \"kubernetes.io/projected/bef20871-4c82-4eeb-83c9-0f47f86b41e1-kube-api-access-mkq8z\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.584558 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.685534 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.685665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkq8z\" (UniqueName: \"kubernetes.io/projected/bef20871-4c82-4eeb-83c9-0f47f86b41e1-kube-api-access-mkq8z\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.685735 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.686521 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.686595 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:02 crc kubenswrapper[4841]: I0130 05:20:02.720673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkq8z\" (UniqueName: \"kubernetes.io/projected/bef20871-4c82-4eeb-83c9-0f47f86b41e1-kube-api-access-mkq8z\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:03 crc kubenswrapper[4841]: I0130 05:20:03.003793 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:03 crc kubenswrapper[4841]: I0130 05:20:03.294824 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr"] Jan 30 05:20:03 crc kubenswrapper[4841]: I0130 05:20:03.701309 4841 generic.go:334] "Generic (PLEG): container finished" podID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerID="8535c4d3fdef7d5abce4a9fb903859e728314235771738330bb447f98197b9c2" exitCode=0 Jan 30 05:20:03 crc kubenswrapper[4841]: I0130 05:20:03.701361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" event={"ID":"bef20871-4c82-4eeb-83c9-0f47f86b41e1","Type":"ContainerDied","Data":"8535c4d3fdef7d5abce4a9fb903859e728314235771738330bb447f98197b9c2"} Jan 30 05:20:03 crc kubenswrapper[4841]: I0130 05:20:03.701421 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" event={"ID":"bef20871-4c82-4eeb-83c9-0f47f86b41e1","Type":"ContainerStarted","Data":"4144452ee77ebed1d85d6a45030c1e57e21702d193f2abea3cbd4715513fdea7"} Jan 30 05:20:05 crc kubenswrapper[4841]: I0130 05:20:05.718237 4841 generic.go:334] "Generic (PLEG): container finished" podID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerID="56de80149b38338e394e3ebc95f14b285fb7a0500a2fd7daec800eec5066535a" exitCode=0 Jan 30 05:20:05 crc kubenswrapper[4841]: I0130 05:20:05.718678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" 
event={"ID":"bef20871-4c82-4eeb-83c9-0f47f86b41e1","Type":"ContainerDied","Data":"56de80149b38338e394e3ebc95f14b285fb7a0500a2fd7daec800eec5066535a"} Jan 30 05:20:06 crc kubenswrapper[4841]: I0130 05:20:06.729594 4841 generic.go:334] "Generic (PLEG): container finished" podID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerID="5c46b4a9f6e51ec33561f4dc1d37aaef87a5fdec3ec9e8d1bb12d43937dc3251" exitCode=0 Jan 30 05:20:06 crc kubenswrapper[4841]: I0130 05:20:06.729679 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" event={"ID":"bef20871-4c82-4eeb-83c9-0f47f86b41e1","Type":"ContainerDied","Data":"5c46b4a9f6e51ec33561f4dc1d37aaef87a5fdec3ec9e8d1bb12d43937dc3251"} Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.066029 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.080026 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-bundle\") pod \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.080083 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-util\") pod \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\" (UID: \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.080161 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkq8z\" (UniqueName: \"kubernetes.io/projected/bef20871-4c82-4eeb-83c9-0f47f86b41e1-kube-api-access-mkq8z\") pod \"bef20871-4c82-4eeb-83c9-0f47f86b41e1\" (UID: 
\"bef20871-4c82-4eeb-83c9-0f47f86b41e1\") " Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.081006 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-bundle" (OuterVolumeSpecName: "bundle") pod "bef20871-4c82-4eeb-83c9-0f47f86b41e1" (UID: "bef20871-4c82-4eeb-83c9-0f47f86b41e1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.086854 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef20871-4c82-4eeb-83c9-0f47f86b41e1-kube-api-access-mkq8z" (OuterVolumeSpecName: "kube-api-access-mkq8z") pod "bef20871-4c82-4eeb-83c9-0f47f86b41e1" (UID: "bef20871-4c82-4eeb-83c9-0f47f86b41e1"). InnerVolumeSpecName "kube-api-access-mkq8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.098717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-util" (OuterVolumeSpecName: "util") pod "bef20871-4c82-4eeb-83c9-0f47f86b41e1" (UID: "bef20871-4c82-4eeb-83c9-0f47f86b41e1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.181805 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkq8z\" (UniqueName: \"kubernetes.io/projected/bef20871-4c82-4eeb-83c9-0f47f86b41e1-kube-api-access-mkq8z\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.182233 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.182259 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bef20871-4c82-4eeb-83c9-0f47f86b41e1-util\") on node \"crc\" DevicePath \"\"" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.750762 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" event={"ID":"bef20871-4c82-4eeb-83c9-0f47f86b41e1","Type":"ContainerDied","Data":"4144452ee77ebed1d85d6a45030c1e57e21702d193f2abea3cbd4715513fdea7"} Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.750837 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4144452ee77ebed1d85d6a45030c1e57e21702d193f2abea3cbd4715513fdea7" Jan 30 05:20:08 crc kubenswrapper[4841]: I0130 05:20:08.750929 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr" Jan 30 05:20:10 crc kubenswrapper[4841]: I0130 05:20:10.463928 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:20:10 crc kubenswrapper[4841]: I0130 05:20:10.464020 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.736864 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"] Jan 30 05:20:17 crc kubenswrapper[4841]: E0130 05:20:17.737449 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerName="pull" Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.737461 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerName="pull" Jan 30 05:20:17 crc kubenswrapper[4841]: E0130 05:20:17.737471 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerName="util" Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.737477 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerName="util" Jan 30 05:20:17 crc kubenswrapper[4841]: E0130 05:20:17.737491 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" 
containerName="extract"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.737499 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerName="extract"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.737586 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef20871-4c82-4eeb-83c9-0f47f86b41e1" containerName="extract"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.737926 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.740052 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.740183 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.740355 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.740389 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5mfhp"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.740454 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.759314 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"]
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.854815 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/581356ab-8116-4326-b032-e5cc2b5ce488-webhook-cert\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.854867 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/581356ab-8116-4326-b032-e5cc2b5ce488-apiservice-cert\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.855024 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngx6m\" (UniqueName: \"kubernetes.io/projected/581356ab-8116-4326-b032-e5cc2b5ce488-kube-api-access-ngx6m\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.955827 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/581356ab-8116-4326-b032-e5cc2b5ce488-apiservice-cert\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.955901 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngx6m\" (UniqueName: \"kubernetes.io/projected/581356ab-8116-4326-b032-e5cc2b5ce488-kube-api-access-ngx6m\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.955947 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/581356ab-8116-4326-b032-e5cc2b5ce488-webhook-cert\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.964343 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/581356ab-8116-4326-b032-e5cc2b5ce488-webhook-cert\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.965716 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/581356ab-8116-4326-b032-e5cc2b5ce488-apiservice-cert\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.974271 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngx6m\" (UniqueName: \"kubernetes.io/projected/581356ab-8116-4326-b032-e5cc2b5ce488-kube-api-access-ngx6m\") pod \"metallb-operator-controller-manager-6b59845b8f-g7kss\" (UID: \"581356ab-8116-4326-b032-e5cc2b5ce488\") " pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.980572 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"]
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.981174 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.984478 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.984501 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-thrwr"
Jan 30 05:20:17 crc kubenswrapper[4841]: I0130 05:20:17.997852 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.046512 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"]
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.051534 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.056785 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-webhook-cert\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.056821 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sghb7\" (UniqueName: \"kubernetes.io/projected/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-kube-api-access-sghb7\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.056864 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-apiservice-cert\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.163099 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-webhook-cert\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.163140 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sghb7\" (UniqueName: \"kubernetes.io/projected/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-kube-api-access-sghb7\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.163186 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-apiservice-cert\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.168009 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-webhook-cert\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.183753 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-apiservice-cert\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.185165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sghb7\" (UniqueName: \"kubernetes.io/projected/de2ec55e-51fe-48f2-87f5-06fb1ceed00a-kube-api-access-sghb7\") pod \"metallb-operator-webhook-server-7c6fcc658d-pprtw\" (UID: \"de2ec55e-51fe-48f2-87f5-06fb1ceed00a\") " pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.310723 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.332065 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"]
Jan 30 05:20:18 crc kubenswrapper[4841]: W0130 05:20:18.349158 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod581356ab_8116_4326_b032_e5cc2b5ce488.slice/crio-453d05b3a5437a6589d96a6df2928224ef226937f3a3b0fc11ec24e7cdb3854f WatchSource:0}: Error finding container 453d05b3a5437a6589d96a6df2928224ef226937f3a3b0fc11ec24e7cdb3854f: Status 404 returned error can't find the container with id 453d05b3a5437a6589d96a6df2928224ef226937f3a3b0fc11ec24e7cdb3854f
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.594808 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"]
Jan 30 05:20:18 crc kubenswrapper[4841]: W0130 05:20:18.601997 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde2ec55e_51fe_48f2_87f5_06fb1ceed00a.slice/crio-99ad2ec1a3b7f7bc1ccd14a797b3cbf7cfed8d0b987c0445536c25c183a04793 WatchSource:0}: Error finding container 99ad2ec1a3b7f7bc1ccd14a797b3cbf7cfed8d0b987c0445536c25c183a04793: Status 404 returned error can't find the container with id 99ad2ec1a3b7f7bc1ccd14a797b3cbf7cfed8d0b987c0445536c25c183a04793
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.812985 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw" event={"ID":"de2ec55e-51fe-48f2-87f5-06fb1ceed00a","Type":"ContainerStarted","Data":"99ad2ec1a3b7f7bc1ccd14a797b3cbf7cfed8d0b987c0445536c25c183a04793"}
Jan 30 05:20:18 crc kubenswrapper[4841]: I0130 05:20:18.814796 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss" event={"ID":"581356ab-8116-4326-b032-e5cc2b5ce488","Type":"ContainerStarted","Data":"453d05b3a5437a6589d96a6df2928224ef226937f3a3b0fc11ec24e7cdb3854f"}
Jan 30 05:20:24 crc kubenswrapper[4841]: I0130 05:20:24.873521 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss" event={"ID":"581356ab-8116-4326-b032-e5cc2b5ce488","Type":"ContainerStarted","Data":"ce76af69be5d2114439af2889d3bf831efdc7602196bb029c2f52f0025a41220"}
Jan 30 05:20:24 crc kubenswrapper[4841]: I0130 05:20:24.874012 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:24 crc kubenswrapper[4841]: I0130 05:20:24.895757 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss" podStartSLOduration=1.927289263 podStartE2EDuration="7.895739641s" podCreationTimestamp="2026-01-30 05:20:17 +0000 UTC" firstStartedPulling="2026-01-30 05:20:18.350752195 +0000 UTC m=+755.344224833" lastFinishedPulling="2026-01-30 05:20:24.319202563 +0000 UTC m=+761.312675211" observedRunningTime="2026-01-30 05:20:24.890385251 +0000 UTC m=+761.883857889" watchObservedRunningTime="2026-01-30 05:20:24.895739641 +0000 UTC m=+761.889212279"
Jan 30 05:20:26 crc kubenswrapper[4841]: I0130 05:20:26.885229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw" event={"ID":"de2ec55e-51fe-48f2-87f5-06fb1ceed00a","Type":"ContainerStarted","Data":"09c76153fd1a0fd0da9b1c274548e94c831aba9e7f96d4b13cee55e9dd3980d9"}
Jan 30 05:20:26 crc kubenswrapper[4841]: I0130 05:20:26.885571 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:26 crc kubenswrapper[4841]: I0130 05:20:26.916926 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw" podStartSLOduration=2.193878515 podStartE2EDuration="9.916902564s" podCreationTimestamp="2026-01-30 05:20:17 +0000 UTC" firstStartedPulling="2026-01-30 05:20:18.605215976 +0000 UTC m=+755.598688614" lastFinishedPulling="2026-01-30 05:20:26.328240015 +0000 UTC m=+763.321712663" observedRunningTime="2026-01-30 05:20:26.912088936 +0000 UTC m=+763.905561574" watchObservedRunningTime="2026-01-30 05:20:26.916902564 +0000 UTC m=+763.910375202"
Jan 30 05:20:38 crc kubenswrapper[4841]: I0130 05:20:38.318137 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7c6fcc658d-pprtw"
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.463293 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.463709 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.463771 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.464605 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59fbdcdf822ce8767e625a9fdb4978cefa9eaf5250b991d0c4b4b761a1a7e71e"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.464730 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://59fbdcdf822ce8767e625a9fdb4978cefa9eaf5250b991d0c4b4b761a1a7e71e" gracePeriod=600
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.976910 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="59fbdcdf822ce8767e625a9fdb4978cefa9eaf5250b991d0c4b4b761a1a7e71e" exitCode=0
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.976980 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"59fbdcdf822ce8767e625a9fdb4978cefa9eaf5250b991d0c4b4b761a1a7e71e"}
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.977214 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"6383000b3c9a91b0197a8dcc28ebef33bc3b14f4f0dc8fac3dcfc3c8dcddb775"}
Jan 30 05:20:40 crc kubenswrapper[4841]: I0130 05:20:40.977239 4841 scope.go:117] "RemoveContainer" containerID="8c604b3b136e322bd1c3a2747b527f8fe3d354491e2d2280a44ada1ad6f6b10d"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.141825 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pt86s"]
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.143629 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.161408 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pt86s"]
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.311033 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-catalog-content\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.311111 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vljc\" (UniqueName: \"kubernetes.io/projected/0334a91b-fcc2-449e-8116-10ab1b289168-kube-api-access-4vljc\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.311142 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-utilities\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.413471 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-utilities\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.413965 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-catalog-content\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.413997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vljc\" (UniqueName: \"kubernetes.io/projected/0334a91b-fcc2-449e-8116-10ab1b289168-kube-api-access-4vljc\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.415076 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-utilities\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.415568 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-catalog-content\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.437995 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vljc\" (UniqueName: \"kubernetes.io/projected/0334a91b-fcc2-449e-8116-10ab1b289168-kube-api-access-4vljc\") pod \"certified-operators-pt86s\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.471680 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt86s"
Jan 30 05:20:49 crc kubenswrapper[4841]: I0130 05:20:49.683696 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pt86s"]
Jan 30 05:20:50 crc kubenswrapper[4841]: I0130 05:20:50.032387 4841 generic.go:334] "Generic (PLEG): container finished" podID="0334a91b-fcc2-449e-8116-10ab1b289168" containerID="74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648" exitCode=0
Jan 30 05:20:50 crc kubenswrapper[4841]: I0130 05:20:50.032460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt86s" event={"ID":"0334a91b-fcc2-449e-8116-10ab1b289168","Type":"ContainerDied","Data":"74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648"}
Jan 30 05:20:50 crc kubenswrapper[4841]: I0130 05:20:50.032491 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt86s" event={"ID":"0334a91b-fcc2-449e-8116-10ab1b289168","Type":"ContainerStarted","Data":"df859ee966401506ac4be30cff57f89d046e0d923976373123bf02550e2c27d5"}
Jan 30 05:20:51 crc kubenswrapper[4841]: I0130 05:20:51.047775 4841 generic.go:334] "Generic (PLEG): container finished" podID="0334a91b-fcc2-449e-8116-10ab1b289168" containerID="3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8" exitCode=0
Jan 30 05:20:51 crc kubenswrapper[4841]: I0130 05:20:51.047830 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt86s" event={"ID":"0334a91b-fcc2-449e-8116-10ab1b289168","Type":"ContainerDied","Data":"3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8"}
Jan 30 05:20:52 crc kubenswrapper[4841]: I0130 05:20:52.056792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt86s" event={"ID":"0334a91b-fcc2-449e-8116-10ab1b289168","Type":"ContainerStarted","Data":"9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360"}
Jan 30 05:20:52 crc kubenswrapper[4841]: I0130 05:20:52.079832 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pt86s" podStartSLOduration=1.538003708 podStartE2EDuration="3.079813591s" podCreationTimestamp="2026-01-30 05:20:49 +0000 UTC" firstStartedPulling="2026-01-30 05:20:50.033950256 +0000 UTC m=+787.027422904" lastFinishedPulling="2026-01-30 05:20:51.575760109 +0000 UTC m=+788.569232787" observedRunningTime="2026-01-30 05:20:52.076644834 +0000 UTC m=+789.070117512" watchObservedRunningTime="2026-01-30 05:20:52.079813591 +0000 UTC m=+789.073286239"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.122422 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-plcnk"]
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.123784 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.144025 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plcnk"]
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.179232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smbvv\" (UniqueName: \"kubernetes.io/projected/a4263196-88ed-4050-82b9-2cde6e41f3a8-kube-api-access-smbvv\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.179326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-utilities\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.179589 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-catalog-content\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.280381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smbvv\" (UniqueName: \"kubernetes.io/projected/a4263196-88ed-4050-82b9-2cde6e41f3a8-kube-api-access-smbvv\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.280700 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-utilities\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.280757 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-catalog-content\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.281167 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-catalog-content\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.281219 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-utilities\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.307873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smbvv\" (UniqueName: \"kubernetes.io/projected/a4263196-88ed-4050-82b9-2cde6e41f3a8-kube-api-access-smbvv\") pod \"redhat-marketplace-plcnk\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.451665 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plcnk"
Jan 30 05:20:54 crc kubenswrapper[4841]: I0130 05:20:54.641515 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plcnk"]
Jan 30 05:20:55 crc kubenswrapper[4841]: I0130 05:20:55.078812 4841 generic.go:334] "Generic (PLEG): container finished" podID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerID="0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3" exitCode=0
Jan 30 05:20:55 crc kubenswrapper[4841]: I0130 05:20:55.078857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plcnk" event={"ID":"a4263196-88ed-4050-82b9-2cde6e41f3a8","Type":"ContainerDied","Data":"0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3"}
Jan 30 05:20:55 crc kubenswrapper[4841]: I0130 05:20:55.078883 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plcnk" event={"ID":"a4263196-88ed-4050-82b9-2cde6e41f3a8","Type":"ContainerStarted","Data":"0f9a8cb055dc5f3963ae0c3c93ab7f0f0d93d17647cd4b5cb3ee676ea1ac4c1d"}
Jan 30 05:20:56 crc kubenswrapper[4841]: I0130 05:20:56.094791 4841 generic.go:334] "Generic (PLEG): container finished" podID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerID="d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c" exitCode=0
Jan 30 05:20:56 crc kubenswrapper[4841]: I0130 05:20:56.094885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plcnk" event={"ID":"a4263196-88ed-4050-82b9-2cde6e41f3a8","Type":"ContainerDied","Data":"d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c"}
Jan 30 05:20:56 crc kubenswrapper[4841]: E0130 05:20:56.109998 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4263196_88ed_4050_82b9_2cde6e41f3a8.slice/crio-d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4263196_88ed_4050_82b9_2cde6e41f3a8.slice/crio-conmon-d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c.scope\": RecentStats: unable to find data in memory cache]"
Jan 30 05:20:57 crc kubenswrapper[4841]: I0130 05:20:57.106428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plcnk" event={"ID":"a4263196-88ed-4050-82b9-2cde6e41f3a8","Type":"ContainerStarted","Data":"a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123"}
Jan 30 05:20:57 crc kubenswrapper[4841]: I0130 05:20:57.137482 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-plcnk" podStartSLOduration=1.6559332310000001 podStartE2EDuration="3.137459716s" podCreationTimestamp="2026-01-30 05:20:54 +0000 UTC" firstStartedPulling="2026-01-30 05:20:55.080667925 +0000 UTC m=+792.074140573" lastFinishedPulling="2026-01-30 05:20:56.56219441 +0000 UTC m=+793.555667058" observedRunningTime="2026-01-30 05:20:57.1317196 +0000 UTC m=+794.125192248" watchObservedRunningTime="2026-01-30 05:20:57.137459716 +0000 UTC m=+794.130932394"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.054257 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6b59845b8f-g7kss"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.760550 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4"]
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.761743 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.763665 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-sq72v"]
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.765649 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-sq72v"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.768760 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.769018 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.769560 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.770537 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gqwjv"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.782224 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4"]
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.870887 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-lq5x8"]
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.871903 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-lq5x8"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.876306 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.881329 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2pl4z"]
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.882215 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2pl4z"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.884698 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.884860 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.885646 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-g8ghb"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.903479 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.907155 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-lq5x8"]
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-startup\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8kg9\" (UniqueName: \"kubernetes.io/projected/d80501d3-b73c-4a52-a12d-81e0115bc785-kube-api-access-q8kg9\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946311 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-conf\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946334 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics-certs\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946533 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07615a0a-8c5e-4800-894f-96d6d83fdc93-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-h8rs4\" (UID: \"07615a0a-8c5e-4800-894f-96d6d83fdc93\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4"
Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-sockets\") pod
\"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhftj\" (UniqueName: \"kubernetes.io/projected/07615a0a-8c5e-4800-894f-96d6d83fdc93-kube-api-access-nhftj\") pod \"frr-k8s-webhook-server-7df86c4f6c-h8rs4\" (UID: \"07615a0a-8c5e-4800-894f-96d6d83fdc93\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:20:58 crc kubenswrapper[4841]: I0130 05:20:58.946652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-reloader\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqw7j\" (UniqueName: \"kubernetes.io/projected/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-kube-api-access-vqw7j\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048386 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-cert\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048431 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-startup\") pod \"frr-k8s-sq72v\" (UID: 
\"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kg9\" (UniqueName: \"kubernetes.io/projected/d80501d3-b73c-4a52-a12d-81e0115bc785-kube-api-access-q8kg9\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048527 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-conf\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048606 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048704 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics-certs\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048724 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-metrics-certs\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048757 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwrj\" (UniqueName: \"kubernetes.io/projected/4d1b8121-f308-408b-9a30-c80bf53ce798-kube-api-access-9bwrj\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07615a0a-8c5e-4800-894f-96d6d83fdc93-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-h8rs4\" (UID: \"07615a0a-8c5e-4800-894f-96d6d83fdc93\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048819 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-metallb-excludel2\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.048858 4841 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048882 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-metrics-certs\") pod \"speaker-2pl4z\" (UID: 
\"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048905 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-sockets\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.048938 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics-certs podName:d80501d3-b73c-4a52-a12d-81e0115bc785 nodeName:}" failed. No retries permitted until 2026-01-30 05:20:59.548920149 +0000 UTC m=+796.542392787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics-certs") pod "frr-k8s-sq72v" (UID: "d80501d3-b73c-4a52-a12d-81e0115bc785") : secret "frr-k8s-certs-secret" not found Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.048998 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhftj\" (UniqueName: \"kubernetes.io/projected/07615a0a-8c5e-4800-894f-96d6d83fdc93-kube-api-access-nhftj\") pod \"frr-k8s-webhook-server-7df86c4f6c-h8rs4\" (UID: \"07615a0a-8c5e-4800-894f-96d6d83fdc93\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.049039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-reloader\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.049294 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" 
(UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-reloader\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.049346 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-sockets\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.049374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.049555 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-conf\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.049683 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d80501d3-b73c-4a52-a12d-81e0115bc785-frr-startup\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.057442 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07615a0a-8c5e-4800-894f-96d6d83fdc93-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-h8rs4\" (UID: \"07615a0a-8c5e-4800-894f-96d6d83fdc93\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:20:59 crc 
kubenswrapper[4841]: I0130 05:20:59.065875 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8kg9\" (UniqueName: \"kubernetes.io/projected/d80501d3-b73c-4a52-a12d-81e0115bc785-kube-api-access-q8kg9\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.070090 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhftj\" (UniqueName: \"kubernetes.io/projected/07615a0a-8c5e-4800-894f-96d6d83fdc93-kube-api-access-nhftj\") pod \"frr-k8s-webhook-server-7df86c4f6c-h8rs4\" (UID: \"07615a0a-8c5e-4800-894f-96d6d83fdc93\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.082738 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.150292 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqw7j\" (UniqueName: \"kubernetes.io/projected/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-kube-api-access-vqw7j\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.150578 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-cert\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.150692 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-metrics-certs\") pod 
\"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.150774 4841 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.150779 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwrj\" (UniqueName: \"kubernetes.io/projected/4d1b8121-f308-408b-9a30-c80bf53ce798-kube-api-access-9bwrj\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.150835 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-metrics-certs podName:4d1b8121-f308-408b-9a30-c80bf53ce798 nodeName:}" failed. No retries permitted until 2026-01-30 05:20:59.650818867 +0000 UTC m=+796.644291505 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-metrics-certs") pod "controller-6968d8fdc4-lq5x8" (UID: "4d1b8121-f308-408b-9a30-c80bf53ce798") : secret "controller-certs-secret" not found Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.150915 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-metallb-excludel2\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.150951 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.150995 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-metrics-certs\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.151064 4841 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.151093 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist podName:2f6fd33c-3515-4d5b-ac7a-582a89f3c82c nodeName:}" failed. No retries permitted until 2026-01-30 05:20:59.651085104 +0000 UTC m=+796.644557742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist") pod "speaker-2pl4z" (UID: "2f6fd33c-3515-4d5b-ac7a-582a89f3c82c") : secret "metallb-memberlist" not found Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.151618 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-metallb-excludel2\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.152123 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.154553 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-metrics-certs\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.166819 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-cert\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.167919 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwrj\" (UniqueName: \"kubernetes.io/projected/4d1b8121-f308-408b-9a30-c80bf53ce798-kube-api-access-9bwrj\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.169860 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vqw7j\" (UniqueName: \"kubernetes.io/projected/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-kube-api-access-vqw7j\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.472947 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pt86s" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.473478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pt86s" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.548448 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pt86s" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.556622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics-certs\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.564488 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d80501d3-b73c-4a52-a12d-81e0115bc785-metrics-certs\") pod \"frr-k8s-sq72v\" (UID: \"d80501d3-b73c-4a52-a12d-81e0115bc785\") " pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.579939 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4"] Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.658792 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-metrics-certs\") pod 
\"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.658874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.659144 4841 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 05:20:59 crc kubenswrapper[4841]: E0130 05:20:59.659271 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist podName:2f6fd33c-3515-4d5b-ac7a-582a89f3c82c nodeName:}" failed. No retries permitted until 2026-01-30 05:21:00.659243048 +0000 UTC m=+797.652715726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist") pod "speaker-2pl4z" (UID: "2f6fd33c-3515-4d5b-ac7a-582a89f3c82c") : secret "metallb-memberlist" not found Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.664329 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d1b8121-f308-408b-9a30-c80bf53ce798-metrics-certs\") pod \"controller-6968d8fdc4-lq5x8\" (UID: \"4d1b8121-f308-408b-9a30-c80bf53ce798\") " pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.693891 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-sq72v" Jan 30 05:20:59 crc kubenswrapper[4841]: I0130 05:20:59.794683 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.092106 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-lq5x8"] Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.159723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerStarted","Data":"b7118cf171e287994b9f016c9549807a1d80148dbd9b78614a348f23dcb6a893"} Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.160719 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" event={"ID":"07615a0a-8c5e-4800-894f-96d6d83fdc93","Type":"ContainerStarted","Data":"1a9ad65c436115f3e39f0f0ae4882415291beea9da5c04434fc824f8d39e94ff"} Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.161519 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-lq5x8" event={"ID":"4d1b8121-f308-408b-9a30-c80bf53ce798","Type":"ContainerStarted","Data":"4d8b1d8554d47785fde4b508114aa0c67eade1bb61713e6ea9ed4b663022ca76"} Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.202540 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pt86s" Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.673773 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist\") pod \"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.681968 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2f6fd33c-3515-4d5b-ac7a-582a89f3c82c-memberlist\") pod 
\"speaker-2pl4z\" (UID: \"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c\") " pod="metallb-system/speaker-2pl4z" Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.704718 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2pl4z" Jan 30 05:21:00 crc kubenswrapper[4841]: I0130 05:21:00.719631 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pt86s"] Jan 30 05:21:00 crc kubenswrapper[4841]: W0130 05:21:00.737123 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f6fd33c_3515_4d5b_ac7a_582a89f3c82c.slice/crio-5132d8ca23f9287503715207b7fa45120b305d3b56cae1eff441290aadf77c70 WatchSource:0}: Error finding container 5132d8ca23f9287503715207b7fa45120b305d3b56cae1eff441290aadf77c70: Status 404 returned error can't find the container with id 5132d8ca23f9287503715207b7fa45120b305d3b56cae1eff441290aadf77c70 Jan 30 05:21:01 crc kubenswrapper[4841]: I0130 05:21:01.169865 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-lq5x8" event={"ID":"4d1b8121-f308-408b-9a30-c80bf53ce798","Type":"ContainerStarted","Data":"132f979ede03a9fe441e3a8d18c6f5015de4703037af6522e414db214f1a6e1c"} Jan 30 05:21:01 crc kubenswrapper[4841]: I0130 05:21:01.170317 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:21:01 crc kubenswrapper[4841]: I0130 05:21:01.170328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-lq5x8" event={"ID":"4d1b8121-f308-408b-9a30-c80bf53ce798","Type":"ContainerStarted","Data":"60d1fc6185e77dcaab04461704061344b8dc4348390f0b26acdd73acddc1d71c"} Jan 30 05:21:01 crc kubenswrapper[4841]: I0130 05:21:01.172977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2pl4z" 
event={"ID":"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c","Type":"ContainerStarted","Data":"ad4ccf656499d856b223aa8af050b859bb95450fc4b16e7560c3839ef97e50d3"} Jan 30 05:21:01 crc kubenswrapper[4841]: I0130 05:21:01.172999 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2pl4z" event={"ID":"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c","Type":"ContainerStarted","Data":"5132d8ca23f9287503715207b7fa45120b305d3b56cae1eff441290aadf77c70"} Jan 30 05:21:01 crc kubenswrapper[4841]: I0130 05:21:01.188540 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-lq5x8" podStartSLOduration=3.188524139 podStartE2EDuration="3.188524139s" podCreationTimestamp="2026-01-30 05:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:21:01.188127048 +0000 UTC m=+798.181599686" watchObservedRunningTime="2026-01-30 05:21:01.188524139 +0000 UTC m=+798.181996777" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.197103 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2pl4z" event={"ID":"2f6fd33c-3515-4d5b-ac7a-582a89f3c82c","Type":"ContainerStarted","Data":"ffd3724dddb66fecf58afce4cb54bfee4a87f22756be3f1bf86f824ac715e751"} Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.198114 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pt86s" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="registry-server" containerID="cri-o://9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360" gracePeriod=2 Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.235840 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2pl4z" podStartSLOduration=4.235824597 podStartE2EDuration="4.235824597s" podCreationTimestamp="2026-01-30 05:20:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:21:02.232123557 +0000 UTC m=+799.225596195" watchObservedRunningTime="2026-01-30 05:21:02.235824597 +0000 UTC m=+799.229297235" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.578565 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt86s" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.618017 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-catalog-content\") pod \"0334a91b-fcc2-449e-8116-10ab1b289168\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.618153 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vljc\" (UniqueName: \"kubernetes.io/projected/0334a91b-fcc2-449e-8116-10ab1b289168-kube-api-access-4vljc\") pod \"0334a91b-fcc2-449e-8116-10ab1b289168\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.618220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-utilities\") pod \"0334a91b-fcc2-449e-8116-10ab1b289168\" (UID: \"0334a91b-fcc2-449e-8116-10ab1b289168\") " Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.618967 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-utilities" (OuterVolumeSpecName: "utilities") pod "0334a91b-fcc2-449e-8116-10ab1b289168" (UID: "0334a91b-fcc2-449e-8116-10ab1b289168"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.633041 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0334a91b-fcc2-449e-8116-10ab1b289168-kube-api-access-4vljc" (OuterVolumeSpecName: "kube-api-access-4vljc") pod "0334a91b-fcc2-449e-8116-10ab1b289168" (UID: "0334a91b-fcc2-449e-8116-10ab1b289168"). InnerVolumeSpecName "kube-api-access-4vljc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.696290 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0334a91b-fcc2-449e-8116-10ab1b289168" (UID: "0334a91b-fcc2-449e-8116-10ab1b289168"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.719438 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vljc\" (UniqueName: \"kubernetes.io/projected/0334a91b-fcc2-449e-8116-10ab1b289168-kube-api-access-4vljc\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.719475 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:02 crc kubenswrapper[4841]: I0130 05:21:02.719485 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0334a91b-fcc2-449e-8116-10ab1b289168-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.216799 4841 generic.go:334] "Generic (PLEG): container finished" podID="0334a91b-fcc2-449e-8116-10ab1b289168" 
containerID="9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360" exitCode=0 Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.216892 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt86s" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.216912 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt86s" event={"ID":"0334a91b-fcc2-449e-8116-10ab1b289168","Type":"ContainerDied","Data":"9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360"} Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.216955 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt86s" event={"ID":"0334a91b-fcc2-449e-8116-10ab1b289168","Type":"ContainerDied","Data":"df859ee966401506ac4be30cff57f89d046e0d923976373123bf02550e2c27d5"} Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.216977 4841 scope.go:117] "RemoveContainer" containerID="9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.217042 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2pl4z" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.247709 4841 scope.go:117] "RemoveContainer" containerID="3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.248562 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pt86s"] Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.255884 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pt86s"] Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.279962 4841 scope.go:117] "RemoveContainer" containerID="74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648" Jan 30 05:21:03 
crc kubenswrapper[4841]: I0130 05:21:03.303654 4841 scope.go:117] "RemoveContainer" containerID="9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360" Jan 30 05:21:03 crc kubenswrapper[4841]: E0130 05:21:03.304195 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360\": container with ID starting with 9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360 not found: ID does not exist" containerID="9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.304242 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360"} err="failed to get container status \"9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360\": rpc error: code = NotFound desc = could not find container \"9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360\": container with ID starting with 9dd0d299bc4f19a409f30aea86fa9fb7d0806f4612b16537bd43da9b677b5360 not found: ID does not exist" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.304264 4841 scope.go:117] "RemoveContainer" containerID="3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8" Jan 30 05:21:03 crc kubenswrapper[4841]: E0130 05:21:03.304883 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8\": container with ID starting with 3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8 not found: ID does not exist" containerID="3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.304914 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8"} err="failed to get container status \"3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8\": rpc error: code = NotFound desc = could not find container \"3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8\": container with ID starting with 3a107043f465429ab2b499a599175c858b4233bb0f72cea2967e64d6ff4786c8 not found: ID does not exist" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.304929 4841 scope.go:117] "RemoveContainer" containerID="74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648" Jan 30 05:21:03 crc kubenswrapper[4841]: E0130 05:21:03.305164 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648\": container with ID starting with 74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648 not found: ID does not exist" containerID="74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648" Jan 30 05:21:03 crc kubenswrapper[4841]: I0130 05:21:03.305184 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648"} err="failed to get container status \"74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648\": rpc error: code = NotFound desc = could not find container \"74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648\": container with ID starting with 74237f13422957bc2160490954c47552b3aaa99a90e0ac851e8de7e7c29c9648 not found: ID does not exist" Jan 30 05:21:04 crc kubenswrapper[4841]: I0130 05:21:04.439830 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" path="/var/lib/kubelet/pods/0334a91b-fcc2-449e-8116-10ab1b289168/volumes" Jan 30 05:21:04 crc kubenswrapper[4841]: I0130 
05:21:04.452690 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-plcnk" Jan 30 05:21:04 crc kubenswrapper[4841]: I0130 05:21:04.452973 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-plcnk" Jan 30 05:21:04 crc kubenswrapper[4841]: I0130 05:21:04.495335 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-plcnk" Jan 30 05:21:05 crc kubenswrapper[4841]: I0130 05:21:05.265890 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-plcnk" Jan 30 05:21:06 crc kubenswrapper[4841]: I0130 05:21:06.117894 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plcnk"] Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.245532 4841 generic.go:334] "Generic (PLEG): container finished" podID="d80501d3-b73c-4a52-a12d-81e0115bc785" containerID="2d1babed33445984becbdb9e38d94e969a28633ccd755068e5bc6eb425d26268" exitCode=0 Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.245667 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerDied","Data":"2d1babed33445984becbdb9e38d94e969a28633ccd755068e5bc6eb425d26268"} Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.248644 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" event={"ID":"07615a0a-8c5e-4800-894f-96d6d83fdc93","Type":"ContainerStarted","Data":"d314550fd24a95ff802dd8d11fb841b99df1795a0a182a38fb6152a8202b9b17"} Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.248943 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-plcnk" 
podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="registry-server" containerID="cri-o://a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123" gracePeriod=2 Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.248981 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.640212 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plcnk" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.671244 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" podStartSLOduration=2.433142169 podStartE2EDuration="9.671227864s" podCreationTimestamp="2026-01-30 05:20:58 +0000 UTC" firstStartedPulling="2026-01-30 05:20:59.596450602 +0000 UTC m=+796.589923280" lastFinishedPulling="2026-01-30 05:21:06.834536337 +0000 UTC m=+803.828008975" observedRunningTime="2026-01-30 05:21:07.305418418 +0000 UTC m=+804.298891066" watchObservedRunningTime="2026-01-30 05:21:07.671227864 +0000 UTC m=+804.664700502" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.691721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smbvv\" (UniqueName: \"kubernetes.io/projected/a4263196-88ed-4050-82b9-2cde6e41f3a8-kube-api-access-smbvv\") pod \"a4263196-88ed-4050-82b9-2cde6e41f3a8\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.691863 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-catalog-content\") pod \"a4263196-88ed-4050-82b9-2cde6e41f3a8\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.691906 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-utilities\") pod \"a4263196-88ed-4050-82b9-2cde6e41f3a8\" (UID: \"a4263196-88ed-4050-82b9-2cde6e41f3a8\") " Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.692983 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-utilities" (OuterVolumeSpecName: "utilities") pod "a4263196-88ed-4050-82b9-2cde6e41f3a8" (UID: "a4263196-88ed-4050-82b9-2cde6e41f3a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.699113 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4263196-88ed-4050-82b9-2cde6e41f3a8-kube-api-access-smbvv" (OuterVolumeSpecName: "kube-api-access-smbvv") pod "a4263196-88ed-4050-82b9-2cde6e41f3a8" (UID: "a4263196-88ed-4050-82b9-2cde6e41f3a8"). InnerVolumeSpecName "kube-api-access-smbvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.711836 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4263196-88ed-4050-82b9-2cde6e41f3a8" (UID: "a4263196-88ed-4050-82b9-2cde6e41f3a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.793349 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.793418 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4263196-88ed-4050-82b9-2cde6e41f3a8-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:07 crc kubenswrapper[4841]: I0130 05:21:07.793432 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smbvv\" (UniqueName: \"kubernetes.io/projected/a4263196-88ed-4050-82b9-2cde6e41f3a8-kube-api-access-smbvv\") on node \"crc\" DevicePath \"\"" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.258671 4841 generic.go:334] "Generic (PLEG): container finished" podID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerID="a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123" exitCode=0 Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.258772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plcnk" event={"ID":"a4263196-88ed-4050-82b9-2cde6e41f3a8","Type":"ContainerDied","Data":"a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123"} Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.258857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plcnk" event={"ID":"a4263196-88ed-4050-82b9-2cde6e41f3a8","Type":"ContainerDied","Data":"0f9a8cb055dc5f3963ae0c3c93ab7f0f0d93d17647cd4b5cb3ee676ea1ac4c1d"} Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.258893 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plcnk" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.258934 4841 scope.go:117] "RemoveContainer" containerID="a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.261769 4841 generic.go:334] "Generic (PLEG): container finished" podID="d80501d3-b73c-4a52-a12d-81e0115bc785" containerID="346d51eec44c5a84c19d8654310746c45e71532e6b5318e67306c2790f6cfea5" exitCode=0 Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.261929 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerDied","Data":"346d51eec44c5a84c19d8654310746c45e71532e6b5318e67306c2790f6cfea5"} Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.287669 4841 scope.go:117] "RemoveContainer" containerID="d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.318877 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plcnk"] Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.320452 4841 scope.go:117] "RemoveContainer" containerID="0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.324977 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-plcnk"] Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.351262 4841 scope.go:117] "RemoveContainer" containerID="a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123" Jan 30 05:21:08 crc kubenswrapper[4841]: E0130 05:21:08.351788 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123\": container with ID starting with 
a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123 not found: ID does not exist" containerID="a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.351840 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123"} err="failed to get container status \"a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123\": rpc error: code = NotFound desc = could not find container \"a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123\": container with ID starting with a00e9f48abac49a321cb8d81b14cbdec265a42fd7c2ab0ac6bcefcff0bd14123 not found: ID does not exist" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.351875 4841 scope.go:117] "RemoveContainer" containerID="d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c" Jan 30 05:21:08 crc kubenswrapper[4841]: E0130 05:21:08.352211 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c\": container with ID starting with d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c not found: ID does not exist" containerID="d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.352264 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c"} err="failed to get container status \"d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c\": rpc error: code = NotFound desc = could not find container \"d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c\": container with ID starting with d207213ea7305e9f976a3b44aa5632da8ca1f4368a25ea30b7085d93f44ac42c not found: ID does not 
exist" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.352298 4841 scope.go:117] "RemoveContainer" containerID="0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3" Jan 30 05:21:08 crc kubenswrapper[4841]: E0130 05:21:08.352700 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3\": container with ID starting with 0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3 not found: ID does not exist" containerID="0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.352726 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3"} err="failed to get container status \"0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3\": rpc error: code = NotFound desc = could not find container \"0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3\": container with ID starting with 0e390c5c8e5e3beaaa8eba879c23c2e9172b04dad7c0a31345344d1a3c8495d3 not found: ID does not exist" Jan 30 05:21:08 crc kubenswrapper[4841]: I0130 05:21:08.439940 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" path="/var/lib/kubelet/pods/a4263196-88ed-4050-82b9-2cde6e41f3a8/volumes" Jan 30 05:21:09 crc kubenswrapper[4841]: I0130 05:21:09.274310 4841 generic.go:334] "Generic (PLEG): container finished" podID="d80501d3-b73c-4a52-a12d-81e0115bc785" containerID="1075c4c542bd03285f87ac8d8ee38dc57ac502f7318c718ea2764efe88a6b5bf" exitCode=0 Jan 30 05:21:09 crc kubenswrapper[4841]: I0130 05:21:09.274389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" 
event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerDied","Data":"1075c4c542bd03285f87ac8d8ee38dc57ac502f7318c718ea2764efe88a6b5bf"} Jan 30 05:21:10 crc kubenswrapper[4841]: I0130 05:21:10.289136 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerStarted","Data":"823f8f43650a6e3a758b509a6f6944ac4b7ec1e12e317229a918920cf060054d"} Jan 30 05:21:10 crc kubenswrapper[4841]: I0130 05:21:10.289194 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerStarted","Data":"77db867c5c9a4135c0ef58c5c63ebae36a8545d6dde84c1cf4229fc7bfbd8847"} Jan 30 05:21:10 crc kubenswrapper[4841]: I0130 05:21:10.289212 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerStarted","Data":"85db9e6e05364639cc93237d7d151b96553a11a25336d7245f4bda2c6bf90d09"} Jan 30 05:21:10 crc kubenswrapper[4841]: I0130 05:21:10.289226 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerStarted","Data":"06ffea35e56f03cf22b15eb25e6ba3bb25c96a770afc8b3a0eae253d303725b7"} Jan 30 05:21:10 crc kubenswrapper[4841]: I0130 05:21:10.289239 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerStarted","Data":"8124ac66fce8e08ccece3cb9a332196c60c92b0cecf59e5d0885d8c9addf70ed"} Jan 30 05:21:11 crc kubenswrapper[4841]: I0130 05:21:11.315476 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sq72v" event={"ID":"d80501d3-b73c-4a52-a12d-81e0115bc785","Type":"ContainerStarted","Data":"6835b0af3a0ee30b8415371ae8d35a55bb277764d5b4980ccffdcdfcdc28e020"} Jan 30 05:21:11 crc 
kubenswrapper[4841]: I0130 05:21:11.316024 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-sq72v" Jan 30 05:21:14 crc kubenswrapper[4841]: I0130 05:21:14.694801 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-sq72v" Jan 30 05:21:14 crc kubenswrapper[4841]: I0130 05:21:14.753487 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-sq72v" Jan 30 05:21:14 crc kubenswrapper[4841]: I0130 05:21:14.784796 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-sq72v" podStartSLOduration=9.777761338 podStartE2EDuration="16.784772207s" podCreationTimestamp="2026-01-30 05:20:58 +0000 UTC" firstStartedPulling="2026-01-30 05:20:59.843121772 +0000 UTC m=+796.836594410" lastFinishedPulling="2026-01-30 05:21:06.850132651 +0000 UTC m=+803.843605279" observedRunningTime="2026-01-30 05:21:11.358203177 +0000 UTC m=+808.351675855" watchObservedRunningTime="2026-01-30 05:21:14.784772207 +0000 UTC m=+811.778244885" Jan 30 05:21:19 crc kubenswrapper[4841]: I0130 05:21:19.092642 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-h8rs4" Jan 30 05:21:19 crc kubenswrapper[4841]: I0130 05:21:19.697335 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-sq72v" Jan 30 05:21:19 crc kubenswrapper[4841]: I0130 05:21:19.800304 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-lq5x8" Jan 30 05:21:20 crc kubenswrapper[4841]: I0130 05:21:20.712694 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2pl4z" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.199557 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"] Jan 30 05:21:22 crc kubenswrapper[4841]: E0130 05:21:22.199953 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="extract-utilities" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.199964 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="extract-utilities" Jan 30 05:21:22 crc kubenswrapper[4841]: E0130 05:21:22.199975 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="extract-utilities" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.199982 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="extract-utilities" Jan 30 05:21:22 crc kubenswrapper[4841]: E0130 05:21:22.199999 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="extract-content" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.200005 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="extract-content" Jan 30 05:21:22 crc kubenswrapper[4841]: E0130 05:21:22.200014 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="extract-content" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.200019 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="extract-content" Jan 30 05:21:22 crc kubenswrapper[4841]: E0130 05:21:22.200027 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="registry-server" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.200032 4841 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="registry-server" Jan 30 05:21:22 crc kubenswrapper[4841]: E0130 05:21:22.200041 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="registry-server" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.200046 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="registry-server" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.200150 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0334a91b-fcc2-449e-8116-10ab1b289168" containerName="registry-server" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.200168 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4263196-88ed-4050-82b9-2cde6e41f3a8" containerName="registry-server" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.200868 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.202705 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.209762 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"] Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.312213 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.312267 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtsms\" (UniqueName: \"kubernetes.io/projected/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-kube-api-access-wtsms\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.312296 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: 
I0130 05:21:22.413675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.413744 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtsms\" (UniqueName: \"kubernetes.io/projected/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-kube-api-access-wtsms\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.413802 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.414217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.414366 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"
Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.470062 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtsms\" (UniqueName: \"kubernetes.io/projected/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-kube-api-access-wtsms\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"
Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.514330 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"
Jan 30 05:21:22 crc kubenswrapper[4841]: I0130 05:21:22.815085 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"]
Jan 30 05:21:22 crc kubenswrapper[4841]: W0130 05:21:22.819549 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37e9576_6d8a_408d_8bbd_ce2df13aecf1.slice/crio-66dfe63c30c4512824fb71095d538cfee8a086cc462f3f7f06bfd8d46f20c803 WatchSource:0}: Error finding container 66dfe63c30c4512824fb71095d538cfee8a086cc462f3f7f06bfd8d46f20c803: Status 404 returned error can't find the container with id 66dfe63c30c4512824fb71095d538cfee8a086cc462f3f7f06bfd8d46f20c803
Jan 30 05:21:23 crc kubenswrapper[4841]: I0130 05:21:23.399856 4841 generic.go:334] "Generic (PLEG): container finished" podID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerID="abf2c5f0e06a503852be60acdcfe497dd81c11b35a26d83af1fa57879ff55cc2" exitCode=0
Jan 30 05:21:23 crc kubenswrapper[4841]: I0130 05:21:23.399953 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" event={"ID":"b37e9576-6d8a-408d-8bbd-ce2df13aecf1","Type":"ContainerDied","Data":"abf2c5f0e06a503852be60acdcfe497dd81c11b35a26d83af1fa57879ff55cc2"}
Jan 30 05:21:23 crc kubenswrapper[4841]: I0130 05:21:23.400175 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" event={"ID":"b37e9576-6d8a-408d-8bbd-ce2df13aecf1","Type":"ContainerStarted","Data":"66dfe63c30c4512824fb71095d538cfee8a086cc462f3f7f06bfd8d46f20c803"}
Jan 30 05:21:26 crc kubenswrapper[4841]: I0130 05:21:26.424932 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" event={"ID":"b37e9576-6d8a-408d-8bbd-ce2df13aecf1","Type":"ContainerStarted","Data":"12b1fa8f901b5be02588bac820a2c91c155d72bfecf3a34785044208ec793d0a"}
Jan 30 05:21:26 crc kubenswrapper[4841]: E0130 05:21:26.624718 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37e9576_6d8a_408d_8bbd_ce2df13aecf1.slice/crio-conmon-12b1fa8f901b5be02588bac820a2c91c155d72bfecf3a34785044208ec793d0a.scope\": RecentStats: unable to find data in memory cache]"
Jan 30 05:21:27 crc kubenswrapper[4841]: I0130 05:21:27.433131 4841 generic.go:334] "Generic (PLEG): container finished" podID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerID="12b1fa8f901b5be02588bac820a2c91c155d72bfecf3a34785044208ec793d0a" exitCode=0
Jan 30 05:21:27 crc kubenswrapper[4841]: I0130 05:21:27.433179 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" event={"ID":"b37e9576-6d8a-408d-8bbd-ce2df13aecf1","Type":"ContainerDied","Data":"12b1fa8f901b5be02588bac820a2c91c155d72bfecf3a34785044208ec793d0a"}
Jan 30 05:21:28 crc kubenswrapper[4841]: I0130 05:21:28.445921 4841 generic.go:334] "Generic (PLEG): container finished" podID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerID="5f5f24a34698123dbca65b50e8eddadbe9b8ac0477ab545affda53adf5f6d146" exitCode=0
Jan 30 05:21:28 crc kubenswrapper[4841]: I0130 05:21:28.446065 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" event={"ID":"b37e9576-6d8a-408d-8bbd-ce2df13aecf1","Type":"ContainerDied","Data":"5f5f24a34698123dbca65b50e8eddadbe9b8ac0477ab545affda53adf5f6d146"}
Jan 30 05:21:29 crc kubenswrapper[4841]: I0130 05:21:29.775733 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"
Jan 30 05:21:29 crc kubenswrapper[4841]: I0130 05:21:29.948115 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-bundle\") pod \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") "
Jan 30 05:21:29 crc kubenswrapper[4841]: I0130 05:21:29.948223 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-util\") pod \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") "
Jan 30 05:21:29 crc kubenswrapper[4841]: I0130 05:21:29.948292 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtsms\" (UniqueName: \"kubernetes.io/projected/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-kube-api-access-wtsms\") pod \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\" (UID: \"b37e9576-6d8a-408d-8bbd-ce2df13aecf1\") "
Jan 30 05:21:29 crc kubenswrapper[4841]: I0130 05:21:29.950127 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-bundle" (OuterVolumeSpecName: "bundle") pod "b37e9576-6d8a-408d-8bbd-ce2df13aecf1" (UID: "b37e9576-6d8a-408d-8bbd-ce2df13aecf1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:21:29 crc kubenswrapper[4841]: I0130 05:21:29.956092 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-kube-api-access-wtsms" (OuterVolumeSpecName: "kube-api-access-wtsms") pod "b37e9576-6d8a-408d-8bbd-ce2df13aecf1" (UID: "b37e9576-6d8a-408d-8bbd-ce2df13aecf1"). InnerVolumeSpecName "kube-api-access-wtsms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:21:29 crc kubenswrapper[4841]: I0130 05:21:29.970791 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-util" (OuterVolumeSpecName: "util") pod "b37e9576-6d8a-408d-8bbd-ce2df13aecf1" (UID: "b37e9576-6d8a-408d-8bbd-ce2df13aecf1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:21:30 crc kubenswrapper[4841]: I0130 05:21:30.050125 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtsms\" (UniqueName: \"kubernetes.io/projected/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-kube-api-access-wtsms\") on node \"crc\" DevicePath \"\""
Jan 30 05:21:30 crc kubenswrapper[4841]: I0130 05:21:30.050177 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:21:30 crc kubenswrapper[4841]: I0130 05:21:30.050196 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b37e9576-6d8a-408d-8bbd-ce2df13aecf1-util\") on node \"crc\" DevicePath \"\""
Jan 30 05:21:30 crc kubenswrapper[4841]: I0130 05:21:30.463645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv" event={"ID":"b37e9576-6d8a-408d-8bbd-ce2df13aecf1","Type":"ContainerDied","Data":"66dfe63c30c4512824fb71095d538cfee8a086cc462f3f7f06bfd8d46f20c803"}
Jan 30 05:21:30 crc kubenswrapper[4841]: I0130 05:21:30.463687 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66dfe63c30c4512824fb71095d538cfee8a086cc462f3f7f06bfd8d46f20c803"
Jan 30 05:21:30 crc kubenswrapper[4841]: I0130 05:21:30.463767 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.463633 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"]
Jan 30 05:21:35 crc kubenswrapper[4841]: E0130 05:21:35.464357 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerName="pull"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.464371 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerName="pull"
Jan 30 05:21:35 crc kubenswrapper[4841]: E0130 05:21:35.464394 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerName="util"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.464425 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerName="util"
Jan 30 05:21:35 crc kubenswrapper[4841]: E0130 05:21:35.464438 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerName="extract"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.464447 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerName="extract"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.464576 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37e9576-6d8a-408d-8bbd-ce2df13aecf1" containerName="extract"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.465025 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.471724 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-zlbdc"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.471753 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.472213 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.489820 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"]
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.633260 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c05bce08-beca-4e59-a260-e28115c11e32-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4tp8q\" (UID: \"c05bce08-beca-4e59-a260-e28115c11e32\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.633387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2977\" (UniqueName: \"kubernetes.io/projected/c05bce08-beca-4e59-a260-e28115c11e32-kube-api-access-w2977\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4tp8q\" (UID: \"c05bce08-beca-4e59-a260-e28115c11e32\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.734239 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c05bce08-beca-4e59-a260-e28115c11e32-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4tp8q\" (UID: \"c05bce08-beca-4e59-a260-e28115c11e32\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.734631 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2977\" (UniqueName: \"kubernetes.io/projected/c05bce08-beca-4e59-a260-e28115c11e32-kube-api-access-w2977\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4tp8q\" (UID: \"c05bce08-beca-4e59-a260-e28115c11e32\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.734782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c05bce08-beca-4e59-a260-e28115c11e32-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4tp8q\" (UID: \"c05bce08-beca-4e59-a260-e28115c11e32\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.751039 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2977\" (UniqueName: \"kubernetes.io/projected/c05bce08-beca-4e59-a260-e28115c11e32-kube-api-access-w2977\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4tp8q\" (UID: \"c05bce08-beca-4e59-a260-e28115c11e32\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.784308 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"
Jan 30 05:21:35 crc kubenswrapper[4841]: I0130 05:21:35.996547 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q"]
Jan 30 05:21:36 crc kubenswrapper[4841]: W0130 05:21:36.016998 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc05bce08_beca_4e59_a260_e28115c11e32.slice/crio-5b79a5d3b0338cbecb55085ac6d2ed0269ebf8af9b7ce622727e318e1e70276f WatchSource:0}: Error finding container 5b79a5d3b0338cbecb55085ac6d2ed0269ebf8af9b7ce622727e318e1e70276f: Status 404 returned error can't find the container with id 5b79a5d3b0338cbecb55085ac6d2ed0269ebf8af9b7ce622727e318e1e70276f
Jan 30 05:21:36 crc kubenswrapper[4841]: I0130 05:21:36.517456 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q" event={"ID":"c05bce08-beca-4e59-a260-e28115c11e32","Type":"ContainerStarted","Data":"5b79a5d3b0338cbecb55085ac6d2ed0269ebf8af9b7ce622727e318e1e70276f"}
Jan 30 05:21:39 crc kubenswrapper[4841]: I0130 05:21:39.541186 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q" event={"ID":"c05bce08-beca-4e59-a260-e28115c11e32","Type":"ContainerStarted","Data":"b79d03eda56d32a0a27cc0e2672560a2f909b33b3cdbfff913bbb6a135d7afea"}
Jan 30 05:21:39 crc kubenswrapper[4841]: I0130 05:21:39.574623 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4tp8q" podStartSLOduration=1.9579846669999998 podStartE2EDuration="4.574593535s" podCreationTimestamp="2026-01-30 05:21:35 +0000 UTC" firstStartedPulling="2026-01-30 05:21:36.019056972 +0000 UTC m=+833.012529610" lastFinishedPulling="2026-01-30 05:21:38.63566581 +0000 UTC m=+835.629138478" observedRunningTime="2026-01-30 05:21:39.572278802 +0000 UTC m=+836.565751470" watchObservedRunningTime="2026-01-30 05:21:39.574593535 +0000 UTC m=+836.568066213"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.499364 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-pfb94"]
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.500253 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.503654 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.504165 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wxlcg"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.504383 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.513668 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-pfb94"]
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.643489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af42727b-5803-4bf8-a60b-992706c93f4b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-pfb94\" (UID: \"af42727b-5803-4bf8-a60b-992706c93f4b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.643735 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbcd7\" (UniqueName: \"kubernetes.io/projected/af42727b-5803-4bf8-a60b-992706c93f4b-kube-api-access-fbcd7\") pod \"cert-manager-cainjector-5545bd876-pfb94\" (UID: \"af42727b-5803-4bf8-a60b-992706c93f4b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.744659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af42727b-5803-4bf8-a60b-992706c93f4b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-pfb94\" (UID: \"af42727b-5803-4bf8-a60b-992706c93f4b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.744723 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbcd7\" (UniqueName: \"kubernetes.io/projected/af42727b-5803-4bf8-a60b-992706c93f4b-kube-api-access-fbcd7\") pod \"cert-manager-cainjector-5545bd876-pfb94\" (UID: \"af42727b-5803-4bf8-a60b-992706c93f4b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.768475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af42727b-5803-4bf8-a60b-992706c93f4b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-pfb94\" (UID: \"af42727b-5803-4bf8-a60b-992706c93f4b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.772461 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbcd7\" (UniqueName: \"kubernetes.io/projected/af42727b-5803-4bf8-a60b-992706c93f4b-kube-api-access-fbcd7\") pod \"cert-manager-cainjector-5545bd876-pfb94\" (UID: \"af42727b-5803-4bf8-a60b-992706c93f4b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:43 crc kubenswrapper[4841]: I0130 05:21:43.814426 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94"
Jan 30 05:21:44 crc kubenswrapper[4841]: I0130 05:21:44.273371 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-pfb94"]
Jan 30 05:21:44 crc kubenswrapper[4841]: I0130 05:21:44.574199 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94" event={"ID":"af42727b-5803-4bf8-a60b-992706c93f4b","Type":"ContainerStarted","Data":"65b00c6c2bf332498f3a740a3134ea2ec6e3297038ec31d272f9c05c1847792a"}
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.519541 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-b8rnj"]
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.521165 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.524901 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2646d"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.531237 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-b8rnj"]
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.704655 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trsjc\" (UniqueName: \"kubernetes.io/projected/1b8b844d-eefd-433f-9f59-3bedb92b8731-kube-api-access-trsjc\") pod \"cert-manager-webhook-6888856db4-b8rnj\" (UID: \"1b8b844d-eefd-433f-9f59-3bedb92b8731\") " pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.704695 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b8b844d-eefd-433f-9f59-3bedb92b8731-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-b8rnj\" (UID: \"1b8b844d-eefd-433f-9f59-3bedb92b8731\") " pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.805687 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trsjc\" (UniqueName: \"kubernetes.io/projected/1b8b844d-eefd-433f-9f59-3bedb92b8731-kube-api-access-trsjc\") pod \"cert-manager-webhook-6888856db4-b8rnj\" (UID: \"1b8b844d-eefd-433f-9f59-3bedb92b8731\") " pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.805733 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b8b844d-eefd-433f-9f59-3bedb92b8731-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-b8rnj\" (UID: \"1b8b844d-eefd-433f-9f59-3bedb92b8731\") " pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.824374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b8b844d-eefd-433f-9f59-3bedb92b8731-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-b8rnj\" (UID: \"1b8b844d-eefd-433f-9f59-3bedb92b8731\") " pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.824720 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trsjc\" (UniqueName: \"kubernetes.io/projected/1b8b844d-eefd-433f-9f59-3bedb92b8731-kube-api-access-trsjc\") pod \"cert-manager-webhook-6888856db4-b8rnj\" (UID: \"1b8b844d-eefd-433f-9f59-3bedb92b8731\") " pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:47 crc kubenswrapper[4841]: I0130 05:21:47.843778 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:48 crc kubenswrapper[4841]: I0130 05:21:48.657609 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-b8rnj"]
Jan 30 05:21:48 crc kubenswrapper[4841]: W0130 05:21:48.661633 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b8b844d_eefd_433f_9f59_3bedb92b8731.slice/crio-8f8b49988301504393b466f36450ce4b098d8f16d5da490de061dfe71fa59311 WatchSource:0}: Error finding container 8f8b49988301504393b466f36450ce4b098d8f16d5da490de061dfe71fa59311: Status 404 returned error can't find the container with id 8f8b49988301504393b466f36450ce4b098d8f16d5da490de061dfe71fa59311
Jan 30 05:21:49 crc kubenswrapper[4841]: I0130 05:21:49.607381 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj" event={"ID":"1b8b844d-eefd-433f-9f59-3bedb92b8731","Type":"ContainerStarted","Data":"1e9008f73f31f536d1d7fdc6d3dd791ed87288d32b4f3b137b1e12838a494834"}
Jan 30 05:21:49 crc kubenswrapper[4841]: I0130 05:21:49.607721 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:21:49 crc kubenswrapper[4841]: I0130 05:21:49.607735 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj" event={"ID":"1b8b844d-eefd-433f-9f59-3bedb92b8731","Type":"ContainerStarted","Data":"8f8b49988301504393b466f36450ce4b098d8f16d5da490de061dfe71fa59311"}
Jan 30 05:21:49 crc kubenswrapper[4841]: I0130 05:21:49.609367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94" event={"ID":"af42727b-5803-4bf8-a60b-992706c93f4b","Type":"ContainerStarted","Data":"d78c1effb8ee2901dc658cd60c313ae6fccf5b8a21544ce3268d17a43c08aadf"}
Jan 30 05:21:49 crc kubenswrapper[4841]: I0130 05:21:49.634811 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj" podStartSLOduration=2.6347816010000003 podStartE2EDuration="2.634781601s" podCreationTimestamp="2026-01-30 05:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:21:49.626355546 +0000 UTC m=+846.619828224" watchObservedRunningTime="2026-01-30 05:21:49.634781601 +0000 UTC m=+846.628254279"
Jan 30 05:21:49 crc kubenswrapper[4841]: I0130 05:21:49.656553 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-pfb94" podStartSLOduration=2.497107915 podStartE2EDuration="6.656534967s" podCreationTimestamp="2026-01-30 05:21:43 +0000 UTC" firstStartedPulling="2026-01-30 05:21:44.293965588 +0000 UTC m=+841.287438256" lastFinishedPulling="2026-01-30 05:21:48.45339267 +0000 UTC m=+845.446865308" observedRunningTime="2026-01-30 05:21:49.65000742 +0000 UTC m=+846.643480068" watchObservedRunningTime="2026-01-30 05:21:49.656534967 +0000 UTC m=+846.650007605"
Jan 30 05:21:57 crc kubenswrapper[4841]: I0130 05:21:57.850165 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-b8rnj"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.099537 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-rnc5n"]
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.101213 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.106545 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-mt4rl"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.109351 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-rnc5n"]
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.207168 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e233256a-c807-4442-bb33-2506fd8d34bd-bound-sa-token\") pod \"cert-manager-545d4d4674-rnc5n\" (UID: \"e233256a-c807-4442-bb33-2506fd8d34bd\") " pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.207525 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j85x\" (UniqueName: \"kubernetes.io/projected/e233256a-c807-4442-bb33-2506fd8d34bd-kube-api-access-5j85x\") pod \"cert-manager-545d4d4674-rnc5n\" (UID: \"e233256a-c807-4442-bb33-2506fd8d34bd\") " pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.309675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e233256a-c807-4442-bb33-2506fd8d34bd-bound-sa-token\") pod \"cert-manager-545d4d4674-rnc5n\" (UID: \"e233256a-c807-4442-bb33-2506fd8d34bd\") " pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.309816 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j85x\" (UniqueName: \"kubernetes.io/projected/e233256a-c807-4442-bb33-2506fd8d34bd-kube-api-access-5j85x\") pod \"cert-manager-545d4d4674-rnc5n\" (UID: \"e233256a-c807-4442-bb33-2506fd8d34bd\") " pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.337536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j85x\" (UniqueName: \"kubernetes.io/projected/e233256a-c807-4442-bb33-2506fd8d34bd-kube-api-access-5j85x\") pod \"cert-manager-545d4d4674-rnc5n\" (UID: \"e233256a-c807-4442-bb33-2506fd8d34bd\") " pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.338942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e233256a-c807-4442-bb33-2506fd8d34bd-bound-sa-token\") pod \"cert-manager-545d4d4674-rnc5n\" (UID: \"e233256a-c807-4442-bb33-2506fd8d34bd\") " pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.442885 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-rnc5n"
Jan 30 05:22:01 crc kubenswrapper[4841]: I0130 05:22:01.744852 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-rnc5n"]
Jan 30 05:22:01 crc kubenswrapper[4841]: W0130 05:22:01.760712 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode233256a_c807_4442_bb33_2506fd8d34bd.slice/crio-7690bc3c21a70acf55e22f64d42b8aa19e3857320388eecf9975c60684ca1d45 WatchSource:0}: Error finding container 7690bc3c21a70acf55e22f64d42b8aa19e3857320388eecf9975c60684ca1d45: Status 404 returned error can't find the container with id 7690bc3c21a70acf55e22f64d42b8aa19e3857320388eecf9975c60684ca1d45
Jan 30 05:22:02 crc kubenswrapper[4841]: I0130 05:22:02.716953 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-rnc5n" event={"ID":"e233256a-c807-4442-bb33-2506fd8d34bd","Type":"ContainerStarted","Data":"4ee699be80fb5bbacc0adec0860ed5bfa6e9b8b8177dea5cea14c93dc3386237"}
Jan 30 05:22:02 crc kubenswrapper[4841]: I0130 05:22:02.717451 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-rnc5n" event={"ID":"e233256a-c807-4442-bb33-2506fd8d34bd","Type":"ContainerStarted","Data":"7690bc3c21a70acf55e22f64d42b8aa19e3857320388eecf9975c60684ca1d45"}
Jan 30 05:22:02 crc kubenswrapper[4841]: I0130 05:22:02.760987 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-rnc5n" podStartSLOduration=1.760963118 podStartE2EDuration="1.760963118s" podCreationTimestamp="2026-01-30 05:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:22:02.75672909 +0000 UTC m=+859.750201748" watchObservedRunningTime="2026-01-30 05:22:02.760963118 +0000 UTC m=+859.754435796"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.399885 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-22chh"]
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.402041 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-22chh"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.406066 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.406205 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.406390 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-pvp9r"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.411505 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-22chh"]
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.559437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjdsw\" (UniqueName: \"kubernetes.io/projected/d2ff95f1-15dc-4400-853e-e5aeb7d686af-kube-api-access-jjdsw\") pod \"openstack-operator-index-22chh\" (UID: \"d2ff95f1-15dc-4400-853e-e5aeb7d686af\") " pod="openstack-operators/openstack-operator-index-22chh"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.660120 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjdsw\" (UniqueName: \"kubernetes.io/projected/d2ff95f1-15dc-4400-853e-e5aeb7d686af-kube-api-access-jjdsw\") pod \"openstack-operator-index-22chh\" (UID: \"d2ff95f1-15dc-4400-853e-e5aeb7d686af\") " pod="openstack-operators/openstack-operator-index-22chh"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.677634 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjdsw\" (UniqueName: \"kubernetes.io/projected/d2ff95f1-15dc-4400-853e-e5aeb7d686af-kube-api-access-jjdsw\") pod \"openstack-operator-index-22chh\" (UID: \"d2ff95f1-15dc-4400-853e-e5aeb7d686af\") " pod="openstack-operators/openstack-operator-index-22chh"
Jan 30 05:22:11 crc kubenswrapper[4841]: I0130 05:22:11.723491 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-22chh"
Jan 30 05:22:12 crc kubenswrapper[4841]: I0130 05:22:12.207414 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-22chh"]
Jan 30 05:22:12 crc kubenswrapper[4841]: I0130 05:22:12.799984 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22chh" event={"ID":"d2ff95f1-15dc-4400-853e-e5aeb7d686af","Type":"ContainerStarted","Data":"2eb5066276064ad91d55052f2528a18588ba097850647e4dcbe603b4f1af643c"}
Jan 30 05:22:13 crc kubenswrapper[4841]: I0130 05:22:13.810311 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22chh" event={"ID":"d2ff95f1-15dc-4400-853e-e5aeb7d686af","Type":"ContainerStarted","Data":"1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41"}
Jan 30 05:22:14 crc kubenswrapper[4841]: I0130 05:22:14.765690 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-22chh" podStartSLOduration=3.071300659 podStartE2EDuration="3.765657728s" podCreationTimestamp="2026-01-30 05:22:11 +0000 UTC" firstStartedPulling="2026-01-30 05:22:12.218477444 +0000 UTC m=+869.211950092" lastFinishedPulling="2026-01-30 05:22:12.912834523 +0000 UTC m=+869.906307161" observedRunningTime="2026-01-30 05:22:13.835649879 +0000 UTC m=+870.829122557" watchObservedRunningTime="2026-01-30 05:22:14.765657728 +0000 UTC m=+871.759130406"
Jan 30 05:22:14 crc kubenswrapper[4841]: I0130 05:22:14.769563 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-22chh"]
Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.375087
4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lzlm9"] Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.376290 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.388719 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lzlm9"] Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.533186 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8zl5\" (UniqueName: \"kubernetes.io/projected/50258f4b-004d-44a8-bdde-f2e607ed183d-kube-api-access-d8zl5\") pod \"openstack-operator-index-lzlm9\" (UID: \"50258f4b-004d-44a8-bdde-f2e607ed183d\") " pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.634671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8zl5\" (UniqueName: \"kubernetes.io/projected/50258f4b-004d-44a8-bdde-f2e607ed183d-kube-api-access-d8zl5\") pod \"openstack-operator-index-lzlm9\" (UID: \"50258f4b-004d-44a8-bdde-f2e607ed183d\") " pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.674295 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8zl5\" (UniqueName: \"kubernetes.io/projected/50258f4b-004d-44a8-bdde-f2e607ed183d-kube-api-access-d8zl5\") pod \"openstack-operator-index-lzlm9\" (UID: \"50258f4b-004d-44a8-bdde-f2e607ed183d\") " pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.754614 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:15 crc kubenswrapper[4841]: I0130 05:22:15.834968 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-22chh" podUID="d2ff95f1-15dc-4400-853e-e5aeb7d686af" containerName="registry-server" containerID="cri-o://1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41" gracePeriod=2 Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.025771 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lzlm9"] Jan 30 05:22:16 crc kubenswrapper[4841]: W0130 05:22:16.030569 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50258f4b_004d_44a8_bdde_f2e607ed183d.slice/crio-0963f970f41076b01735b5299b4618a029f9eef9e8fbb4e714e45cb2cae4a52f WatchSource:0}: Error finding container 0963f970f41076b01735b5299b4618a029f9eef9e8fbb4e714e45cb2cae4a52f: Status 404 returned error can't find the container with id 0963f970f41076b01735b5299b4618a029f9eef9e8fbb4e714e45cb2cae4a52f Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.211459 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-22chh" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.343472 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjdsw\" (UniqueName: \"kubernetes.io/projected/d2ff95f1-15dc-4400-853e-e5aeb7d686af-kube-api-access-jjdsw\") pod \"d2ff95f1-15dc-4400-853e-e5aeb7d686af\" (UID: \"d2ff95f1-15dc-4400-853e-e5aeb7d686af\") " Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.350646 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ff95f1-15dc-4400-853e-e5aeb7d686af-kube-api-access-jjdsw" (OuterVolumeSpecName: "kube-api-access-jjdsw") pod "d2ff95f1-15dc-4400-853e-e5aeb7d686af" (UID: "d2ff95f1-15dc-4400-853e-e5aeb7d686af"). InnerVolumeSpecName "kube-api-access-jjdsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.445178 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjdsw\" (UniqueName: \"kubernetes.io/projected/d2ff95f1-15dc-4400-853e-e5aeb7d686af-kube-api-access-jjdsw\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.845202 4841 generic.go:334] "Generic (PLEG): container finished" podID="d2ff95f1-15dc-4400-853e-e5aeb7d686af" containerID="1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41" exitCode=0 Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.845330 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22chh" event={"ID":"d2ff95f1-15dc-4400-853e-e5aeb7d686af","Type":"ContainerDied","Data":"1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41"} Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.845389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22chh" 
event={"ID":"d2ff95f1-15dc-4400-853e-e5aeb7d686af","Type":"ContainerDied","Data":"2eb5066276064ad91d55052f2528a18588ba097850647e4dcbe603b4f1af643c"} Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.845468 4841 scope.go:117] "RemoveContainer" containerID="1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.845652 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-22chh" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.848975 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lzlm9" event={"ID":"50258f4b-004d-44a8-bdde-f2e607ed183d","Type":"ContainerStarted","Data":"42ad5d0ec26217ea9d0436918953809817b26fe1809de3cdbdbaff7b4b04f9cc"} Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.849056 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lzlm9" event={"ID":"50258f4b-004d-44a8-bdde-f2e607ed183d","Type":"ContainerStarted","Data":"0963f970f41076b01735b5299b4618a029f9eef9e8fbb4e714e45cb2cae4a52f"} Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.880987 4841 scope.go:117] "RemoveContainer" containerID="1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41" Jan 30 05:22:16 crc kubenswrapper[4841]: E0130 05:22:16.882362 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41\": container with ID starting with 1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41 not found: ID does not exist" containerID="1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.882488 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41"} err="failed to get container status \"1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41\": rpc error: code = NotFound desc = could not find container \"1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41\": container with ID starting with 1fb3c2bcecd9f4372e4161a5155abb09a8f7be70e50fea71e03bfc28a058cb41 not found: ID does not exist" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.888845 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lzlm9" podStartSLOduration=1.44693044 podStartE2EDuration="1.888806789s" podCreationTimestamp="2026-01-30 05:22:15 +0000 UTC" firstStartedPulling="2026-01-30 05:22:16.036192287 +0000 UTC m=+873.029664915" lastFinishedPulling="2026-01-30 05:22:16.478068586 +0000 UTC m=+873.471541264" observedRunningTime="2026-01-30 05:22:16.881318098 +0000 UTC m=+873.874790786" watchObservedRunningTime="2026-01-30 05:22:16.888806789 +0000 UTC m=+873.882279467" Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.921610 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-22chh"] Jan 30 05:22:16 crc kubenswrapper[4841]: I0130 05:22:16.927464 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-22chh"] Jan 30 05:22:18 crc kubenswrapper[4841]: I0130 05:22:18.448034 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ff95f1-15dc-4400-853e-e5aeb7d686af" path="/var/lib/kubelet/pods/d2ff95f1-15dc-4400-853e-e5aeb7d686af/volumes" Jan 30 05:22:25 crc kubenswrapper[4841]: I0130 05:22:25.755367 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:25 crc kubenswrapper[4841]: I0130 05:22:25.755881 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:25 crc kubenswrapper[4841]: I0130 05:22:25.791226 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:25 crc kubenswrapper[4841]: I0130 05:22:25.950877 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lzlm9" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.032616 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88"] Jan 30 05:22:31 crc kubenswrapper[4841]: E0130 05:22:31.033618 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ff95f1-15dc-4400-853e-e5aeb7d686af" containerName="registry-server" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.033666 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ff95f1-15dc-4400-853e-e5aeb7d686af" containerName="registry-server" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.033907 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ff95f1-15dc-4400-853e-e5aeb7d686af" containerName="registry-server" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.034995 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.039154 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-7n5n8" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.046275 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88"] Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.084469 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prsz\" (UniqueName: \"kubernetes.io/projected/d34858df-a7b1-4f65-9406-1b86d5a84e7d-kube-api-access-8prsz\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.084954 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.085211 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 
05:22:31.186146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.186391 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8prsz\" (UniqueName: \"kubernetes.io/projected/d34858df-a7b1-4f65-9406-1b86d5a84e7d-kube-api-access-8prsz\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.186592 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.186807 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.187267 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.222119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prsz\" (UniqueName: \"kubernetes.io/projected/d34858df-a7b1-4f65-9406-1b86d5a84e7d-kube-api-access-8prsz\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.371014 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.667639 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88"] Jan 30 05:22:31 crc kubenswrapper[4841]: W0130 05:22:31.680537 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd34858df_a7b1_4f65_9406_1b86d5a84e7d.slice/crio-e6942d32d33ec7adf8eacbf7e11a28ac8d6ed71392fd2cd4080f046e12fcb8a3 WatchSource:0}: Error finding container e6942d32d33ec7adf8eacbf7e11a28ac8d6ed71392fd2cd4080f046e12fcb8a3: Status 404 returned error can't find the container with id e6942d32d33ec7adf8eacbf7e11a28ac8d6ed71392fd2cd4080f046e12fcb8a3 Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.963337 4841 generic.go:334] "Generic (PLEG): container finished" podID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerID="dc178bf8fe3ac704aedddd999a29822b5cddda89a8f7cb38ee9c5dde55c14714" exitCode=0 Jan 30 
05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.963390 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" event={"ID":"d34858df-a7b1-4f65-9406-1b86d5a84e7d","Type":"ContainerDied","Data":"dc178bf8fe3ac704aedddd999a29822b5cddda89a8f7cb38ee9c5dde55c14714"} Jan 30 05:22:31 crc kubenswrapper[4841]: I0130 05:22:31.963444 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" event={"ID":"d34858df-a7b1-4f65-9406-1b86d5a84e7d","Type":"ContainerStarted","Data":"e6942d32d33ec7adf8eacbf7e11a28ac8d6ed71392fd2cd4080f046e12fcb8a3"} Jan 30 05:22:32 crc kubenswrapper[4841]: I0130 05:22:32.973745 4841 generic.go:334] "Generic (PLEG): container finished" podID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerID="45386fb43f16fe7437848803284456774ccd4c9b06316ecb9242ee3321e50759" exitCode=0 Jan 30 05:22:32 crc kubenswrapper[4841]: I0130 05:22:32.973826 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" event={"ID":"d34858df-a7b1-4f65-9406-1b86d5a84e7d","Type":"ContainerDied","Data":"45386fb43f16fe7437848803284456774ccd4c9b06316ecb9242ee3321e50759"} Jan 30 05:22:33 crc kubenswrapper[4841]: I0130 05:22:33.986621 4841 generic.go:334] "Generic (PLEG): container finished" podID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerID="0edf2fff92cd900d7cf5e4f9aa7196d0f21a638886714d5ccc695328353e0eda" exitCode=0 Jan 30 05:22:33 crc kubenswrapper[4841]: I0130 05:22:33.986696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" event={"ID":"d34858df-a7b1-4f65-9406-1b86d5a84e7d","Type":"ContainerDied","Data":"0edf2fff92cd900d7cf5e4f9aa7196d0f21a638886714d5ccc695328353e0eda"} Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.249248 
4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.350599 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-bundle\") pod \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.350757 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8prsz\" (UniqueName: \"kubernetes.io/projected/d34858df-a7b1-4f65-9406-1b86d5a84e7d-kube-api-access-8prsz\") pod \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.350844 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-util\") pod \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\" (UID: \"d34858df-a7b1-4f65-9406-1b86d5a84e7d\") " Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.351677 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-bundle" (OuterVolumeSpecName: "bundle") pod "d34858df-a7b1-4f65-9406-1b86d5a84e7d" (UID: "d34858df-a7b1-4f65-9406-1b86d5a84e7d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.360481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34858df-a7b1-4f65-9406-1b86d5a84e7d-kube-api-access-8prsz" (OuterVolumeSpecName: "kube-api-access-8prsz") pod "d34858df-a7b1-4f65-9406-1b86d5a84e7d" (UID: "d34858df-a7b1-4f65-9406-1b86d5a84e7d"). 
InnerVolumeSpecName "kube-api-access-8prsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.373327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-util" (OuterVolumeSpecName: "util") pod "d34858df-a7b1-4f65-9406-1b86d5a84e7d" (UID: "d34858df-a7b1-4f65-9406-1b86d5a84e7d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.452799 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.452863 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8prsz\" (UniqueName: \"kubernetes.io/projected/d34858df-a7b1-4f65-9406-1b86d5a84e7d-kube-api-access-8prsz\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:35 crc kubenswrapper[4841]: I0130 05:22:35.452886 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d34858df-a7b1-4f65-9406-1b86d5a84e7d-util\") on node \"crc\" DevicePath \"\"" Jan 30 05:22:36 crc kubenswrapper[4841]: I0130 05:22:36.002026 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" event={"ID":"d34858df-a7b1-4f65-9406-1b86d5a84e7d","Type":"ContainerDied","Data":"e6942d32d33ec7adf8eacbf7e11a28ac8d6ed71392fd2cd4080f046e12fcb8a3"} Jan 30 05:22:36 crc kubenswrapper[4841]: I0130 05:22:36.002075 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6942d32d33ec7adf8eacbf7e11a28ac8d6ed71392fd2cd4080f046e12fcb8a3" Jan 30 05:22:36 crc kubenswrapper[4841]: I0130 05:22:36.002116 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.909166 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd"] Jan 30 05:22:37 crc kubenswrapper[4841]: E0130 05:22:37.910025 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerName="extract" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.910053 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerName="extract" Jan 30 05:22:37 crc kubenswrapper[4841]: E0130 05:22:37.910087 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerName="util" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.910106 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerName="util" Jan 30 05:22:37 crc kubenswrapper[4841]: E0130 05:22:37.910133 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerName="pull" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.910153 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerName="pull" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.910491 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34858df-a7b1-4f65-9406-1b86d5a84e7d" containerName="extract" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.911383 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.913887 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-cdrpj" Jan 30 05:22:37 crc kubenswrapper[4841]: I0130 05:22:37.929633 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd"] Jan 30 05:22:38 crc kubenswrapper[4841]: I0130 05:22:38.087474 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b95jd\" (UniqueName: \"kubernetes.io/projected/fff273ef-511c-42c9-aeaa-b9f1b67483e8-kube-api-access-b95jd\") pod \"openstack-operator-controller-init-757f46c65d-45fmd\" (UID: \"fff273ef-511c-42c9-aeaa-b9f1b67483e8\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" Jan 30 05:22:38 crc kubenswrapper[4841]: I0130 05:22:38.188273 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b95jd\" (UniqueName: \"kubernetes.io/projected/fff273ef-511c-42c9-aeaa-b9f1b67483e8-kube-api-access-b95jd\") pod \"openstack-operator-controller-init-757f46c65d-45fmd\" (UID: \"fff273ef-511c-42c9-aeaa-b9f1b67483e8\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" Jan 30 05:22:38 crc kubenswrapper[4841]: I0130 05:22:38.206472 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b95jd\" (UniqueName: \"kubernetes.io/projected/fff273ef-511c-42c9-aeaa-b9f1b67483e8-kube-api-access-b95jd\") pod \"openstack-operator-controller-init-757f46c65d-45fmd\" (UID: \"fff273ef-511c-42c9-aeaa-b9f1b67483e8\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" Jan 30 05:22:38 crc kubenswrapper[4841]: I0130 05:22:38.266158 4841 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" Jan 30 05:22:38 crc kubenswrapper[4841]: I0130 05:22:38.750925 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd"] Jan 30 05:22:39 crc kubenswrapper[4841]: I0130 05:22:39.042918 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" event={"ID":"fff273ef-511c-42c9-aeaa-b9f1b67483e8","Type":"ContainerStarted","Data":"2e600410498fa17a9f2d35068557cfbc8543cff6754ecbe263f1c28df4387740"} Jan 30 05:22:40 crc kubenswrapper[4841]: I0130 05:22:40.463162 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:22:40 crc kubenswrapper[4841]: I0130 05:22:40.463485 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:22:44 crc kubenswrapper[4841]: I0130 05:22:44.086510 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" event={"ID":"fff273ef-511c-42c9-aeaa-b9f1b67483e8","Type":"ContainerStarted","Data":"9d7cb23cb924ba8e2a5989ebc7a23f888c3ea1d3868f2c006ffd73b98fe8c36b"} Jan 30 05:22:44 crc kubenswrapper[4841]: I0130 05:22:44.087210 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" Jan 30 05:22:44 crc 
kubenswrapper[4841]: I0130 05:22:44.141481 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" podStartSLOduration=2.3249262760000002 podStartE2EDuration="7.141456546s" podCreationTimestamp="2026-01-30 05:22:37 +0000 UTC" firstStartedPulling="2026-01-30 05:22:38.756535995 +0000 UTC m=+895.750008633" lastFinishedPulling="2026-01-30 05:22:43.573066245 +0000 UTC m=+900.566538903" observedRunningTime="2026-01-30 05:22:44.137652919 +0000 UTC m=+901.131125597" watchObservedRunningTime="2026-01-30 05:22:44.141456546 +0000 UTC m=+901.134929224" Jan 30 05:22:48 crc kubenswrapper[4841]: I0130 05:22:48.269560 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-45fmd" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.371578 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-frp95"] Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.373077 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.382544 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frp95"] Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.523457 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pdhn\" (UniqueName: \"kubernetes.io/projected/f92a542d-4baf-4e08-9da4-02f641b46d94-kube-api-access-9pdhn\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.523504 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-catalog-content\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.523616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-utilities\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.624289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-catalog-content\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.624717 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-catalog-content\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.624866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-utilities\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.625096 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-utilities\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.625165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pdhn\" (UniqueName: \"kubernetes.io/projected/f92a542d-4baf-4e08-9da4-02f641b46d94-kube-api-access-9pdhn\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.645271 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pdhn\" (UniqueName: \"kubernetes.io/projected/f92a542d-4baf-4e08-9da4-02f641b46d94-kube-api-access-9pdhn\") pod \"community-operators-frp95\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:02 crc kubenswrapper[4841]: I0130 05:23:02.690285 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:03 crc kubenswrapper[4841]: I0130 05:23:03.220940 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frp95"] Jan 30 05:23:03 crc kubenswrapper[4841]: W0130 05:23:03.228367 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf92a542d_4baf_4e08_9da4_02f641b46d94.slice/crio-743ded506fa512af6c0118484eca5ddbac846ee8230d2f7e8e66fc5de9e0cf90 WatchSource:0}: Error finding container 743ded506fa512af6c0118484eca5ddbac846ee8230d2f7e8e66fc5de9e0cf90: Status 404 returned error can't find the container with id 743ded506fa512af6c0118484eca5ddbac846ee8230d2f7e8e66fc5de9e0cf90 Jan 30 05:23:03 crc kubenswrapper[4841]: I0130 05:23:03.285950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frp95" event={"ID":"f92a542d-4baf-4e08-9da4-02f641b46d94","Type":"ContainerStarted","Data":"743ded506fa512af6c0118484eca5ddbac846ee8230d2f7e8e66fc5de9e0cf90"} Jan 30 05:23:04 crc kubenswrapper[4841]: I0130 05:23:04.292312 4841 generic.go:334] "Generic (PLEG): container finished" podID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerID="6d591635894f482579edc61b5d0f40a687c509fc10cc0691ddd34b71e0991ba8" exitCode=0 Jan 30 05:23:04 crc kubenswrapper[4841]: I0130 05:23:04.292445 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frp95" event={"ID":"f92a542d-4baf-4e08-9da4-02f641b46d94","Type":"ContainerDied","Data":"6d591635894f482579edc61b5d0f40a687c509fc10cc0691ddd34b71e0991ba8"} Jan 30 05:23:06 crc kubenswrapper[4841]: I0130 05:23:06.308741 4841 generic.go:334] "Generic (PLEG): container finished" podID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerID="19378b28dc95e086fad25cb225d457941eb2c49ec1c1c4dca098bcd9994e58dd" exitCode=0 Jan 30 05:23:06 crc kubenswrapper[4841]: I0130 
05:23:06.308973 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frp95" event={"ID":"f92a542d-4baf-4e08-9da4-02f641b46d94","Type":"ContainerDied","Data":"19378b28dc95e086fad25cb225d457941eb2c49ec1c1c4dca098bcd9994e58dd"} Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.317443 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frp95" event={"ID":"f92a542d-4baf-4e08-9da4-02f641b46d94","Type":"ContainerStarted","Data":"b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b"} Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.339062 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-frp95" podStartSLOduration=2.92103182 podStartE2EDuration="5.339042282s" podCreationTimestamp="2026-01-30 05:23:02 +0000 UTC" firstStartedPulling="2026-01-30 05:23:04.293812996 +0000 UTC m=+921.287285634" lastFinishedPulling="2026-01-30 05:23:06.711823438 +0000 UTC m=+923.705296096" observedRunningTime="2026-01-30 05:23:07.338689453 +0000 UTC m=+924.332162091" watchObservedRunningTime="2026-01-30 05:23:07.339042282 +0000 UTC m=+924.332514930" Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.944800 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr"] Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.945826 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.952050 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dlld2" Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.962142 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j"] Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.963155 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.966544 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4524k" Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.969140 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr"] Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.976908 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j"] Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.996858 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq"] Jan 30 05:23:07 crc kubenswrapper[4841]: I0130 05:23:07.998628 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.000871 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-tsl8l" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.020288 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.023116 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.030371 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-l8rt8" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.035459 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.041482 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.042315 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.048973 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4ddlf" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.070985 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.092784 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.093471 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.097810 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-psswb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.101098 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.103824 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj225\" (UniqueName: \"kubernetes.io/projected/da35fca1-95bf-4bf9-9dc6-2696846c402d-kube-api-access-jj225\") pod \"designate-operator-controller-manager-6d9697b7f4-b9wjq\" (UID: \"da35fca1-95bf-4bf9-9dc6-2696846c402d\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.103873 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrnt\" 
(UniqueName: \"kubernetes.io/projected/683d1de5-a849-4cc4-ad31-e4ddce58ce3a-kube-api-access-kwrnt\") pod \"cinder-operator-controller-manager-8d874c8fc-c857j\" (UID: \"683d1de5-a849-4cc4-ad31-e4ddce58ce3a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.104039 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbrvm\" (UniqueName: \"kubernetes.io/projected/5abd246f-9cd6-44a5-b189-3f757aa6904b-kube-api-access-fbrvm\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-ggldr\" (UID: \"5abd246f-9cd6-44a5-b189-3f757aa6904b\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.104090 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-758pt\" (UniqueName: \"kubernetes.io/projected/36279e2b-d43d-452c-a159-269eccab814a-kube-api-access-758pt\") pod \"heat-operator-controller-manager-69d6db494d-69jsz\" (UID: \"36279e2b-d43d-452c-a159-269eccab814a\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.105819 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.114970 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.116317 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.117940 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.118670 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.120546 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-w6t5q" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.120982 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qhrf6" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.121457 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.137234 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.137989 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.143820 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.150762 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2g6gs" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.152334 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.182792 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.192174 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-8622s"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226159 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47czj\" (UniqueName: \"kubernetes.io/projected/382d26e6-1e34-4de0-8e4a-50230ce1a90f-kube-api-access-47czj\") pod \"glance-operator-controller-manager-8886f4c47-5bf7x\" (UID: \"382d26e6-1e34-4de0-8e4a-50230ce1a90f\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226221 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsb5p\" (UniqueName: \"kubernetes.io/projected/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-kube-api-access-bsb5p\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " 
pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226259 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbrvm\" (UniqueName: \"kubernetes.io/projected/5abd246f-9cd6-44a5-b189-3f757aa6904b-kube-api-access-fbrvm\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-ggldr\" (UID: \"5abd246f-9cd6-44a5-b189-3f757aa6904b\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-758pt\" (UniqueName: \"kubernetes.io/projected/36279e2b-d43d-452c-a159-269eccab814a-kube-api-access-758pt\") pod \"heat-operator-controller-manager-69d6db494d-69jsz\" (UID: \"36279e2b-d43d-452c-a159-269eccab814a\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8d8p\" (UniqueName: \"kubernetes.io/projected/99db8f34-be75-405c-abfc-c79f8a246b3a-kube-api-access-k8d8p\") pod \"horizon-operator-controller-manager-5fb775575f-qf8h2\" (UID: \"99db8f34-be75-405c-abfc-c79f8a246b3a\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226368 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2l5m\" (UniqueName: \"kubernetes.io/projected/e0cb95b9-d9f5-4927-b7e5-47199da17894-kube-api-access-b2l5m\") pod \"ironic-operator-controller-manager-5f4b8bd54d-74h94\" (UID: \"e0cb95b9-d9f5-4927-b7e5-47199da17894\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 
05:23:08.226390 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj225\" (UniqueName: \"kubernetes.io/projected/da35fca1-95bf-4bf9-9dc6-2696846c402d-kube-api-access-jj225\") pod \"designate-operator-controller-manager-6d9697b7f4-b9wjq\" (UID: \"da35fca1-95bf-4bf9-9dc6-2696846c402d\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.226536 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrnt\" (UniqueName: \"kubernetes.io/projected/683d1de5-a849-4cc4-ad31-e4ddce58ce3a-kube-api-access-kwrnt\") pod \"cinder-operator-controller-manager-8d874c8fc-c857j\" (UID: \"683d1de5-a849-4cc4-ad31-e4ddce58ce3a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.227584 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.230498 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.231508 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.232110 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hxf6t" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.233422 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-8622s"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.234769 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xx6qh" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.261223 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbrvm\" (UniqueName: \"kubernetes.io/projected/5abd246f-9cd6-44a5-b189-3f757aa6904b-kube-api-access-fbrvm\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-ggldr\" (UID: \"5abd246f-9cd6-44a5-b189-3f757aa6904b\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.261980 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.266619 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-758pt\" (UniqueName: \"kubernetes.io/projected/36279e2b-d43d-452c-a159-269eccab814a-kube-api-access-758pt\") pod \"heat-operator-controller-manager-69d6db494d-69jsz\" (UID: \"36279e2b-d43d-452c-a159-269eccab814a\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.269682 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj225\" (UniqueName: 
\"kubernetes.io/projected/da35fca1-95bf-4bf9-9dc6-2696846c402d-kube-api-access-jj225\") pod \"designate-operator-controller-manager-6d9697b7f4-b9wjq\" (UID: \"da35fca1-95bf-4bf9-9dc6-2696846c402d\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.270998 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrnt\" (UniqueName: \"kubernetes.io/projected/683d1de5-a849-4cc4-ad31-e4ddce58ce3a-kube-api-access-kwrnt\") pod \"cinder-operator-controller-manager-8d874c8fc-c857j\" (UID: \"683d1de5-a849-4cc4-ad31-e4ddce58ce3a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.277543 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.278376 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.278922 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.279315 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.286084 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4hqtf" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.287309 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-b28st" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.302277 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.302581 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.307684 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.322251 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.323177 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.324648 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.328187 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-657hj" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.329455 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.330175 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.330281 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8wrn\" (UniqueName: \"kubernetes.io/projected/d3675976-12f6-4169-8354-b3fbca99354a-kube-api-access-l8wrn\") pod \"keystone-operator-controller-manager-84f48565d4-b64xq\" (UID: \"d3675976-12f6-4169-8354-b3fbca99354a\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.330341 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47czj\" (UniqueName: \"kubernetes.io/projected/382d26e6-1e34-4de0-8e4a-50230ce1a90f-kube-api-access-47czj\") pod \"glance-operator-controller-manager-8886f4c47-5bf7x\" (UID: \"382d26e6-1e34-4de0-8e4a-50230ce1a90f\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.330389 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bsb5p\" (UniqueName: \"kubernetes.io/projected/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-kube-api-access-bsb5p\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.330471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrrs\" (UniqueName: \"kubernetes.io/projected/53bed426-a8df-4f33-8c52-c838d1a47f35-kube-api-access-ccrrs\") pod \"manila-operator-controller-manager-7dd968899f-8622s\" (UID: \"53bed426-a8df-4f33-8c52-c838d1a47f35\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" Jan 30 05:23:08 crc kubenswrapper[4841]: E0130 05:23:08.330491 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:08 crc kubenswrapper[4841]: E0130 05:23:08.330562 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert podName:a3671fee-5baf-4bcf-8246-49b65ef8f0c8 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:08.830542723 +0000 UTC m=+925.824015351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert") pod "infra-operator-controller-manager-79955696d6-bz4zb" (UID: "a3671fee-5baf-4bcf-8246-49b65ef8f0c8") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.330503 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8d8p\" (UniqueName: \"kubernetes.io/projected/99db8f34-be75-405c-abfc-c79f8a246b3a-kube-api-access-k8d8p\") pod \"horizon-operator-controller-manager-5fb775575f-qf8h2\" (UID: \"99db8f34-be75-405c-abfc-c79f8a246b3a\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.331764 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2l5m\" (UniqueName: \"kubernetes.io/projected/e0cb95b9-d9f5-4927-b7e5-47199da17894-kube-api-access-b2l5m\") pod \"ironic-operator-controller-manager-5f4b8bd54d-74h94\" (UID: \"e0cb95b9-d9f5-4927-b7e5-47199da17894\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.354004 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.356528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47czj\" (UniqueName: \"kubernetes.io/projected/382d26e6-1e34-4de0-8e4a-50230ce1a90f-kube-api-access-47czj\") pod \"glance-operator-controller-manager-8886f4c47-5bf7x\" (UID: \"382d26e6-1e34-4de0-8e4a-50230ce1a90f\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.356082 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8d8p\" (UniqueName: \"kubernetes.io/projected/99db8f34-be75-405c-abfc-c79f8a246b3a-kube-api-access-k8d8p\") pod \"horizon-operator-controller-manager-5fb775575f-qf8h2\" (UID: \"99db8f34-be75-405c-abfc-c79f8a246b3a\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.359265 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2l5m\" (UniqueName: \"kubernetes.io/projected/e0cb95b9-d9f5-4927-b7e5-47199da17894-kube-api-access-b2l5m\") pod \"ironic-operator-controller-manager-5f4b8bd54d-74h94\" (UID: \"e0cb95b9-d9f5-4927-b7e5-47199da17894\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.362136 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.362999 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.362994 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsb5p\" (UniqueName: \"kubernetes.io/projected/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-kube-api-access-bsb5p\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.364630 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-dfctq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.371348 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.372240 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.375456 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.376637 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.377618 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.377879 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cl26t" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.381230 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.385231 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.386170 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.388755 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-g7qld" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.394170 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.398303 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.411436 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.443408 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrrs\" (UniqueName: \"kubernetes.io/projected/53bed426-a8df-4f33-8c52-c838d1a47f35-kube-api-access-ccrrs\") pod \"manila-operator-controller-manager-7dd968899f-8622s\" (UID: \"53bed426-a8df-4f33-8c52-c838d1a47f35\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.444238 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8n4f\" (UniqueName: \"kubernetes.io/projected/0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8-kube-api-access-d8n4f\") pod \"mariadb-operator-controller-manager-67bf948998-6mfjj\" (UID: \"0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.444325 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6lr5\" (UniqueName: \"kubernetes.io/projected/9dfc9a4b-0426-4293-b78d-e63c74b0ec96-kube-api-access-w6lr5\") pod \"neutron-operator-controller-manager-585dbc889-ptsr8\" (UID: \"9dfc9a4b-0426-4293-b78d-e63c74b0ec96\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.444365 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg5xh\" (UniqueName: \"kubernetes.io/projected/22c16a0e-8121-40ce-82a1-6129d8f4b017-kube-api-access-kg5xh\") pod \"octavia-operator-controller-manager-6687f8d877-s227p\" (UID: \"22c16a0e-8121-40ce-82a1-6129d8f4b017\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.444382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwgft\" (UniqueName: \"kubernetes.io/projected/a9d964d8-8035-47e5-9ed9-c7713882002c-kube-api-access-hwgft\") pod \"nova-operator-controller-manager-55bff696bd-cwtpg\" (UID: \"a9d964d8-8035-47e5-9ed9-c7713882002c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.444420 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wrn\" (UniqueName: \"kubernetes.io/projected/d3675976-12f6-4169-8354-b3fbca99354a-kube-api-access-l8wrn\") pod \"keystone-operator-controller-manager-84f48565d4-b64xq\" (UID: \"d3675976-12f6-4169-8354-b3fbca99354a\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.444456 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98fhs\" 
(UniqueName: \"kubernetes.io/projected/3b8874f3-5993-4156-be01-f7952851fb6f-kube-api-access-98fhs\") pod \"placement-operator-controller-manager-5b964cf4cd-fjqg4\" (UID: \"3b8874f3-5993-4156-be01-f7952851fb6f\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.452638 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.456344 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.457270 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.461653 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9q6p6" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.475312 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrrs\" (UniqueName: \"kubernetes.io/projected/53bed426-a8df-4f33-8c52-c838d1a47f35-kube-api-access-ccrrs\") pod \"manila-operator-controller-manager-7dd968899f-8622s\" (UID: \"53bed426-a8df-4f33-8c52-c838d1a47f35\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.475469 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8wrn\" (UniqueName: \"kubernetes.io/projected/d3675976-12f6-4169-8354-b3fbca99354a-kube-api-access-l8wrn\") pod \"keystone-operator-controller-manager-84f48565d4-b64xq\" (UID: \"d3675976-12f6-4169-8354-b3fbca99354a\") " 
pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.502711 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546569 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8n4f\" (UniqueName: \"kubernetes.io/projected/0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8-kube-api-access-d8n4f\") pod \"mariadb-operator-controller-manager-67bf948998-6mfjj\" (UID: \"0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7nwz\" (UniqueName: \"kubernetes.io/projected/72033411-0acc-4925-85c1-9fe48cb2157d-kube-api-access-d7nwz\") pod \"swift-operator-controller-manager-68fc8c869-56qdv\" (UID: \"72033411-0acc-4925-85c1-9fe48cb2157d\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546678 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6lr5\" (UniqueName: \"kubernetes.io/projected/9dfc9a4b-0426-4293-b78d-e63c74b0ec96-kube-api-access-w6lr5\") pod 
\"neutron-operator-controller-manager-585dbc889-ptsr8\" (UID: \"9dfc9a4b-0426-4293-b78d-e63c74b0ec96\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546693 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdx59\" (UniqueName: \"kubernetes.io/projected/f8753b24-bc7c-4623-904c-a7c7c0dd7aec-kube-api-access-zdx59\") pod \"ovn-operator-controller-manager-788c46999f-fhf8t\" (UID: \"f8753b24-bc7c-4623-904c-a7c7c0dd7aec\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546708 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzcbn\" (UniqueName: \"kubernetes.io/projected/13851207-9bc9-41c9-b6d6-3dab03a5e62c-kube-api-access-tzcbn\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546756 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg5xh\" (UniqueName: \"kubernetes.io/projected/22c16a0e-8121-40ce-82a1-6129d8f4b017-kube-api-access-kg5xh\") pod \"octavia-operator-controller-manager-6687f8d877-s227p\" (UID: \"22c16a0e-8121-40ce-82a1-6129d8f4b017\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546772 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwgft\" (UniqueName: \"kubernetes.io/projected/a9d964d8-8035-47e5-9ed9-c7713882002c-kube-api-access-hwgft\") pod \"nova-operator-controller-manager-55bff696bd-cwtpg\" (UID: \"a9d964d8-8035-47e5-9ed9-c7713882002c\") " 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.546804 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98fhs\" (UniqueName: \"kubernetes.io/projected/3b8874f3-5993-4156-be01-f7952851fb6f-kube-api-access-98fhs\") pod \"placement-operator-controller-manager-5b964cf4cd-fjqg4\" (UID: \"3b8874f3-5993-4156-be01-f7952851fb6f\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.547505 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.554113 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.588465 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.589488 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.636671 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8n4f\" (UniqueName: \"kubernetes.io/projected/0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8-kube-api-access-d8n4f\") pod \"mariadb-operator-controller-manager-67bf948998-6mfjj\" (UID: \"0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.637538 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwgft\" (UniqueName: \"kubernetes.io/projected/a9d964d8-8035-47e5-9ed9-c7713882002c-kube-api-access-hwgft\") pod \"nova-operator-controller-manager-55bff696bd-cwtpg\" (UID: \"a9d964d8-8035-47e5-9ed9-c7713882002c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.637611 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98fhs\" (UniqueName: \"kubernetes.io/projected/3b8874f3-5993-4156-be01-f7952851fb6f-kube-api-access-98fhs\") pod \"placement-operator-controller-manager-5b964cf4cd-fjqg4\" (UID: \"3b8874f3-5993-4156-be01-f7952851fb6f\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.637699 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vqtd6" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.638954 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg5xh\" (UniqueName: \"kubernetes.io/projected/22c16a0e-8121-40ce-82a1-6129d8f4b017-kube-api-access-kg5xh\") pod \"octavia-operator-controller-manager-6687f8d877-s227p\" 
(UID: \"22c16a0e-8121-40ce-82a1-6129d8f4b017\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.640109 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.647363 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6lr5\" (UniqueName: \"kubernetes.io/projected/9dfc9a4b-0426-4293-b78d-e63c74b0ec96-kube-api-access-w6lr5\") pod \"neutron-operator-controller-manager-585dbc889-ptsr8\" (UID: \"9dfc9a4b-0426-4293-b78d-e63c74b0ec96\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.665714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nwz\" (UniqueName: \"kubernetes.io/projected/72033411-0acc-4925-85c1-9fe48cb2157d-kube-api-access-d7nwz\") pod \"swift-operator-controller-manager-68fc8c869-56qdv\" (UID: \"72033411-0acc-4925-85c1-9fe48cb2157d\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.665832 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.665865 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdx59\" (UniqueName: \"kubernetes.io/projected/f8753b24-bc7c-4623-904c-a7c7c0dd7aec-kube-api-access-zdx59\") pod 
\"ovn-operator-controller-manager-788c46999f-fhf8t\" (UID: \"f8753b24-bc7c-4623-904c-a7c7c0dd7aec\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.665884 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzcbn\" (UniqueName: \"kubernetes.io/projected/13851207-9bc9-41c9-b6d6-3dab03a5e62c-kube-api-access-tzcbn\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:08 crc kubenswrapper[4841]: E0130 05:23:08.667305 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:08 crc kubenswrapper[4841]: E0130 05:23:08.667378 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert podName:13851207-9bc9-41c9-b6d6-3dab03a5e62c nodeName:}" failed. No retries permitted until 2026-01-30 05:23:09.167358004 +0000 UTC m=+926.160830642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" (UID: "13851207-9bc9-41c9-b6d6-3dab03a5e62c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.695000 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.695181 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.704159 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7nwz\" (UniqueName: \"kubernetes.io/projected/72033411-0acc-4925-85c1-9fe48cb2157d-kube-api-access-d7nwz\") pod \"swift-operator-controller-manager-68fc8c869-56qdv\" (UID: \"72033411-0acc-4925-85c1-9fe48cb2157d\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.706442 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdx59\" (UniqueName: \"kubernetes.io/projected/f8753b24-bc7c-4623-904c-a7c7c0dd7aec-kube-api-access-zdx59\") pod \"ovn-operator-controller-manager-788c46999f-fhf8t\" (UID: \"f8753b24-bc7c-4623-904c-a7c7c0dd7aec\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.711082 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzcbn\" (UniqueName: \"kubernetes.io/projected/13851207-9bc9-41c9-b6d6-3dab03a5e62c-kube-api-access-tzcbn\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.726860 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.744920 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.799762 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h6q7\" (UniqueName: \"kubernetes.io/projected/4c201612-a22c-44ad-8f91-c5e4f45e895f-kube-api-access-5h6q7\") pod \"telemetry-operator-controller-manager-64b5b76f97-zfhk4\" (UID: \"4c201612-a22c-44ad-8f91-c5e4f45e895f\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.801657 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.825422 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.841013 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.842050 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.847896 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nsrjp" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.850319 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.860248 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-hlthw"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.861102 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.867815 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5gxfj" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.867975 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-hlthw"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.868474 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.882577 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.908437 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.908479 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpjg9\" (UniqueName: \"kubernetes.io/projected/14b05ee0-2b05-4406-8721-979476d7c5be-kube-api-access-zpjg9\") pod \"test-operator-controller-manager-56f8bfcd9f-vcq29\" (UID: \"14b05ee0-2b05-4406-8721-979476d7c5be\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" Jan 30 05:23:08 crc kubenswrapper[4841]: E0130 05:23:08.908769 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:08 crc kubenswrapper[4841]: E0130 05:23:08.908878 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert podName:a3671fee-5baf-4bcf-8246-49b65ef8f0c8 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:09.908860851 +0000 UTC m=+926.902333489 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert") pod "infra-operator-controller-manager-79955696d6-bz4zb" (UID: "a3671fee-5baf-4bcf-8246-49b65ef8f0c8") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.909032 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h6q7\" (UniqueName: \"kubernetes.io/projected/4c201612-a22c-44ad-8f91-c5e4f45e895f-kube-api-access-5h6q7\") pod \"telemetry-operator-controller-manager-64b5b76f97-zfhk4\" (UID: \"4c201612-a22c-44ad-8f91-c5e4f45e895f\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.911424 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.912635 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.916873 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.919207 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.919673 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-cdvz5" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.928279 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.930044 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.931789 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.938196 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2"] Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.938441 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kvqnz" Jan 30 05:23:08 crc kubenswrapper[4841]: I0130 05:23:08.944724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h6q7\" (UniqueName: \"kubernetes.io/projected/4c201612-a22c-44ad-8f91-c5e4f45e895f-kube-api-access-5h6q7\") pod \"telemetry-operator-controller-manager-64b5b76f97-zfhk4\" (UID: \"4c201612-a22c-44ad-8f91-c5e4f45e895f\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.002715 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.030088 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9mw7\" (UniqueName: \"kubernetes.io/projected/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-kube-api-access-p9mw7\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.030360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.030382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sw4m\" (UniqueName: \"kubernetes.io/projected/cad53450-3002-49dc-bde3-c32d90ec2272-kube-api-access-8sw4m\") pod \"watcher-operator-controller-manager-564965969-hlthw\" (UID: \"cad53450-3002-49dc-bde3-c32d90ec2272\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw" Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.030482 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-899hv\" (UniqueName: \"kubernetes.io/projected/9585c98b-78c2-4860-a00f-4347390f4432-kube-api-access-899hv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wbqf2\" (UID: \"9585c98b-78c2-4860-a00f-4347390f4432\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.030531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpjg9\" (UniqueName: \"kubernetes.io/projected/14b05ee0-2b05-4406-8721-979476d7c5be-kube-api-access-zpjg9\") pod \"test-operator-controller-manager-56f8bfcd9f-vcq29\" (UID: \"14b05ee0-2b05-4406-8721-979476d7c5be\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.040531 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.078413 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpjg9\" (UniqueName: \"kubernetes.io/projected/14b05ee0-2b05-4406-8721-979476d7c5be-kube-api-access-zpjg9\") pod \"test-operator-controller-manager-56f8bfcd9f-vcq29\" (UID: \"14b05ee0-2b05-4406-8721-979476d7c5be\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.141423 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.141496 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9mw7\" (UniqueName: \"kubernetes.io/projected/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-kube-api-access-p9mw7\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.141556 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.141571 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.141633 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:09.641616731 +0000 UTC m=+926.635089369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "webhook-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.141768 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.141574 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sw4m\" (UniqueName: \"kubernetes.io/projected/cad53450-3002-49dc-bde3-c32d90ec2272-kube-api-access-8sw4m\") pod \"watcher-operator-controller-manager-564965969-hlthw\" (UID: \"cad53450-3002-49dc-bde3-c32d90ec2272\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw"
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.141818 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:09.641800906 +0000 UTC m=+926.635273544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "metrics-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.141886 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-899hv\" (UniqueName: \"kubernetes.io/projected/9585c98b-78c2-4860-a00f-4347390f4432-kube-api-access-899hv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wbqf2\" (UID: \"9585c98b-78c2-4860-a00f-4347390f4432\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.145009 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.160004 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-899hv\" (UniqueName: \"kubernetes.io/projected/9585c98b-78c2-4860-a00f-4347390f4432-kube-api-access-899hv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wbqf2\" (UID: \"9585c98b-78c2-4860-a00f-4347390f4432\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.166170 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sw4m\" (UniqueName: \"kubernetes.io/projected/cad53450-3002-49dc-bde3-c32d90ec2272-kube-api-access-8sw4m\") pod \"watcher-operator-controller-manager-564965969-hlthw\" (UID: \"cad53450-3002-49dc-bde3-c32d90ec2272\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.167417 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9mw7\" (UniqueName: \"kubernetes.io/projected/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-kube-api-access-p9mw7\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.177483 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.201697 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw"
Jan 30 05:23:09 crc kubenswrapper[4841]: W0130 05:23:09.218604 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda35fca1_95bf_4bf9_9dc6_2696846c402d.slice/crio-b08ae714e296cd419209c2ba2b3295c0506a261af1c2323b3629296bde904605 WatchSource:0}: Error finding container b08ae714e296cd419209c2ba2b3295c0506a261af1c2323b3629296bde904605: Status 404 returned error can't find the container with id b08ae714e296cd419209c2ba2b3295c0506a261af1c2323b3629296bde904605
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.245029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f"
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.245180 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.245221 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert podName:13851207-9bc9-41c9-b6d6-3dab03a5e62c nodeName:}" failed. No retries permitted until 2026-01-30 05:23:10.245209009 +0000 UTC m=+927.238681647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" (UID: "13851207-9bc9-41c9-b6d6-3dab03a5e62c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.291115 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.341098 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.384652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" event={"ID":"da35fca1-95bf-4bf9-9dc6-2696846c402d","Type":"ContainerStarted","Data":"b08ae714e296cd419209c2ba2b3295c0506a261af1c2323b3629296bde904605"}
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.387144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" event={"ID":"5abd246f-9cd6-44a5-b189-3f757aa6904b","Type":"ContainerStarted","Data":"662ebd147956d5b69e6711642327e3ef736815f10ea18bee4a785054eed8191e"}
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.626218 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.633051 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.656943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.657077 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.657233 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.657279 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:10.65726573 +0000 UTC m=+927.650738368 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "webhook-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.657341 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: E0130 05:23:09.657418 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:10.657387353 +0000 UTC m=+927.650859991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "metrics-server-cert" not found
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.660886 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-8622s"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.803187 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq"]
Jan 30 05:23:09 crc kubenswrapper[4841]: W0130 05:23:09.814490 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3675976_12f6_4169_8354_b3fbca99354a.slice/crio-8c95b4bfc4fdca3c493085d0fea198c0f844c2200f273658612720b79fee562b WatchSource:0}: Error finding container 8c95b4bfc4fdca3c493085d0fea198c0f844c2200f273658612720b79fee562b: Status 404 returned error can't find the container with id 8c95b4bfc4fdca3c493085d0fea198c0f844c2200f273658612720b79fee562b
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.905794 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.913290 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p"]
Jan 30 05:23:09 crc kubenswrapper[4841]: W0130 05:23:09.913557 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36279e2b_d43d_452c_a159_269eccab814a.slice/crio-3e3acf93aadc9c28193e136b02afcbc8bdf70320028916da3ead740c51f1589e WatchSource:0}: Error finding container 3e3acf93aadc9c28193e136b02afcbc8bdf70320028916da3ead740c51f1589e: Status 404 returned error can't find the container with id 3e3acf93aadc9c28193e136b02afcbc8bdf70320028916da3ead740c51f1589e
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.937756 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.962726 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4"]
Jan 30 05:23:09 crc kubenswrapper[4841]: I0130 05:23:09.982296 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg"]
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.007570 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.008129 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.008210 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert podName:a3671fee-5baf-4bcf-8246-49b65ef8f0c8 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:12.008183244 +0000 UTC m=+929.001655872 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert") pod "infra-operator-controller-manager-79955696d6-bz4zb" (UID: "a3671fee-5baf-4bcf-8246-49b65ef8f0c8") : secret "infra-operator-webhook-server-cert" not found
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.020593 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2"]
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.112663 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29"]
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.147452 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4"]
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.152465 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv"]
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.160696 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj"]
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.160743 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-hlthw"]
Jan 30 05:23:10 crc kubenswrapper[4841]: W0130 05:23:10.182899 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b8874f3_5993_4156_be01_f7952851fb6f.slice/crio-3ab330af3f285c2793e570ca426117e474402f2a0ac61a868b96a048b13512a1 WatchSource:0}: Error finding container 3ab330af3f285c2793e570ca426117e474402f2a0ac61a868b96a048b13512a1: Status 404 returned error can't find the container with id 3ab330af3f285c2793e570ca426117e474402f2a0ac61a868b96a048b13512a1
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.211749 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-98fhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-fjqg4_openstack-operators(3b8874f3-5993-4156-be01-f7952851fb6f): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.211892 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d7nwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-56qdv_openstack-operators(72033411-0acc-4925-85c1-9fe48cb2157d): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.211980 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8n4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-6mfjj_openstack-operators(0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.212863 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" podUID="3b8874f3-5993-4156-be01-f7952851fb6f"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.212939 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" podUID="72033411-0acc-4925-85c1-9fe48cb2157d"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.215488 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" podUID="0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8"
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.280608 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t"]
Jan 30 05:23:10 crc kubenswrapper[4841]: W0130 05:23:10.284644 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8753b24_bc7c_4623_904c_a7c7c0dd7aec.slice/crio-baf6d9c9a99b834258ab2c0aeddbf761a2759c74f1030f43acd2693ae03d7208 WatchSource:0}: Error finding container baf6d9c9a99b834258ab2c0aeddbf761a2759c74f1030f43acd2693ae03d7208: Status 404 returned error can't find the container with id baf6d9c9a99b834258ab2c0aeddbf761a2759c74f1030f43acd2693ae03d7208
Jan 30 05:23:10 crc kubenswrapper[4841]: W0130 05:23:10.285145 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9585c98b_78c2_4860_a00f_4347390f4432.slice/crio-4dbe03ff394ec87be2e50208a9ea4d98430509c68efd5f37a0a8263fc643c395 WatchSource:0}: Error finding container 4dbe03ff394ec87be2e50208a9ea4d98430509c68efd5f37a0a8263fc643c395: Status 404 returned error can't find the container with id 4dbe03ff394ec87be2e50208a9ea4d98430509c68efd5f37a0a8263fc643c395
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.307255 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2"]
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.313958 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.314078 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.314117 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert podName:13851207-9bc9-41c9-b6d6-3dab03a5e62c nodeName:}" failed. No retries permitted until 2026-01-30 05:23:12.314103957 +0000 UTC m=+929.307576595 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" (UID: "13851207-9bc9-41c9-b6d6-3dab03a5e62c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.317729 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zdx59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-fhf8t_openstack-operators(f8753b24-bc7c-4623-904c-a7c7c0dd7aec): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.320561 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" podUID="f8753b24-bc7c-4623-904c-a7c7c0dd7aec"
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.392715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" event={"ID":"683d1de5-a849-4cc4-ad31-e4ddce58ce3a","Type":"ContainerStarted","Data":"b71e8ef33ab3a2ac5f0e2d4a66fc0f8e977769a2245afdf5a3fa334b48760a52"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.394054 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" event={"ID":"9dfc9a4b-0426-4293-b78d-e63c74b0ec96","Type":"ContainerStarted","Data":"068857852b5aed3546f53b9ee1b70ffd725a9df403f0f48a16b9a83ed5e3fc20"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.396574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" event={"ID":"99db8f34-be75-405c-abfc-c79f8a246b3a","Type":"ContainerStarted","Data":"e5e1493a9379ceb1bcf275977a33bfe8e035d24d4c72fd95f2ad2d772a346601"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.397576 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" event={"ID":"a9d964d8-8035-47e5-9ed9-c7713882002c","Type":"ContainerStarted","Data":"9cb385682f9fe1f0a8cf4df34fab57a191b205f70254ce05a084fe46b8b42d3f"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.403358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" event={"ID":"382d26e6-1e34-4de0-8e4a-50230ce1a90f","Type":"ContainerStarted","Data":"024c3250cfe84cc4f87e7136367e280be4d133681f3dd9d6a333f8a5c59c1ed8"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.405020 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" event={"ID":"f8753b24-bc7c-4623-904c-a7c7c0dd7aec","Type":"ContainerStarted","Data":"baf6d9c9a99b834258ab2c0aeddbf761a2759c74f1030f43acd2693ae03d7208"}
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.407342 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" podUID="f8753b24-bc7c-4623-904c-a7c7c0dd7aec"
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.415529 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" event={"ID":"14b05ee0-2b05-4406-8721-979476d7c5be","Type":"ContainerStarted","Data":"7e9c08873edf346081974f144543be4450d27bf49be5c4b92a58cf29a4705b90"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.424538 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" event={"ID":"72033411-0acc-4925-85c1-9fe48cb2157d","Type":"ContainerStarted","Data":"f0b7e599276cfaf7e3012db7b4c9bed3502f9120b5574b6848e9d808e0ba6609"}
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.426035 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" podUID="72033411-0acc-4925-85c1-9fe48cb2157d"
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.426599 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" event={"ID":"e0cb95b9-d9f5-4927-b7e5-47199da17894","Type":"ContainerStarted","Data":"d83afa7a8c539d3dc024ee3de91fa75e0fdfdbca7df75f1c606a87a61c494fc9"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.427831 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" event={"ID":"53bed426-a8df-4f33-8c52-c838d1a47f35","Type":"ContainerStarted","Data":"5db8f4d6f96930123f07df0877c5113ffaca7b061bd412dc09729cf3ee7b70f1"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.428809 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" event={"ID":"22c16a0e-8121-40ce-82a1-6129d8f4b017","Type":"ContainerStarted","Data":"24a5082be0b4d06e1fe55cbae21a59fba26be3468083eb5896f84c5dd878940e"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.430371 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" event={"ID":"3b8874f3-5993-4156-be01-f7952851fb6f","Type":"ContainerStarted","Data":"3ab330af3f285c2793e570ca426117e474402f2a0ac61a868b96a048b13512a1"}
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.432596 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" podUID="3b8874f3-5993-4156-be01-f7952851fb6f"
Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.443916 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" podUID="0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8"
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.455710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" event={"ID":"36279e2b-d43d-452c-a159-269eccab814a","Type":"ContainerStarted","Data":"3e3acf93aadc9c28193e136b02afcbc8bdf70320028916da3ead740c51f1589e"}
Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.455760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw"
event={"ID":"cad53450-3002-49dc-bde3-c32d90ec2272","Type":"ContainerStarted","Data":"12aea85c3ade46606dfbd32f689ffc1f3ab0da5021b1f7ada14ef7ad8b2a3724"} Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.455773 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" event={"ID":"0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8","Type":"ContainerStarted","Data":"dd7c14388677963cac188957f0dd9dc4d7ca37eb0bad8531f737d9f0148715dc"} Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.455794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" event={"ID":"d3675976-12f6-4169-8354-b3fbca99354a","Type":"ContainerStarted","Data":"8c95b4bfc4fdca3c493085d0fea198c0f844c2200f273658612720b79fee562b"} Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.463925 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.463964 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.473813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" event={"ID":"4c201612-a22c-44ad-8f91-c5e4f45e895f","Type":"ContainerStarted","Data":"70f57ae78ff6b37272a29f282388a2f1e7a7f57049228d95635401051235d2c4"} Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 
05:23:10.475250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2" event={"ID":"9585c98b-78c2-4860-a00f-4347390f4432","Type":"ContainerStarted","Data":"4dbe03ff394ec87be2e50208a9ea4d98430509c68efd5f37a0a8263fc643c395"} Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.720959 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:10 crc kubenswrapper[4841]: I0130 05:23:10.721042 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.721112 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.721177 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.721191 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:12.721169033 +0000 UTC m=+929.714641671 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "webhook-server-cert" not found Jan 30 05:23:10 crc kubenswrapper[4841]: E0130 05:23:10.721220 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:12.721206794 +0000 UTC m=+929.714679432 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "metrics-server-cert" not found Jan 30 05:23:11 crc kubenswrapper[4841]: E0130 05:23:11.502331 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" podUID="f8753b24-bc7c-4623-904c-a7c7c0dd7aec" Jan 30 05:23:11 crc kubenswrapper[4841]: E0130 05:23:11.502337 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" podUID="3b8874f3-5993-4156-be01-f7952851fb6f" Jan 30 05:23:11 crc kubenswrapper[4841]: E0130 05:23:11.502411 4841 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" podUID="0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8" Jan 30 05:23:11 crc kubenswrapper[4841]: E0130 05:23:11.514452 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" podUID="72033411-0acc-4925-85c1-9fe48cb2157d" Jan 30 05:23:12 crc kubenswrapper[4841]: I0130 05:23:12.051807 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.051995 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.052144 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert podName:a3671fee-5baf-4bcf-8246-49b65ef8f0c8 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:16.052129045 +0000 UTC m=+933.045601683 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert") pod "infra-operator-controller-manager-79955696d6-bz4zb" (UID: "a3671fee-5baf-4bcf-8246-49b65ef8f0c8") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: I0130 05:23:12.355678 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.355843 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.355915 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert podName:13851207-9bc9-41c9-b6d6-3dab03a5e62c nodeName:}" failed. No retries permitted until 2026-01-30 05:23:16.35589864 +0000 UTC m=+933.349371278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" (UID: "13851207-9bc9-41c9-b6d6-3dab03a5e62c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: I0130 05:23:12.691000 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:12 crc kubenswrapper[4841]: I0130 05:23:12.691079 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:12 crc kubenswrapper[4841]: I0130 05:23:12.765021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:12 crc kubenswrapper[4841]: I0130 05:23:12.765114 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.765199 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.765268 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs 
podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:16.765251168 +0000 UTC m=+933.758723806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "webhook-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.765269 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: E0130 05:23:12.765328 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:16.76531426 +0000 UTC m=+933.758786898 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "metrics-server-cert" not found Jan 30 05:23:12 crc kubenswrapper[4841]: I0130 05:23:12.768013 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:13 crc kubenswrapper[4841]: I0130 05:23:13.565499 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:15 crc kubenswrapper[4841]: I0130 05:23:15.166773 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frp95"] Jan 30 05:23:16 crc kubenswrapper[4841]: I0130 05:23:16.128679 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.128878 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.129009 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert podName:a3671fee-5baf-4bcf-8246-49b65ef8f0c8 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:24.128973799 +0000 UTC m=+941.122446477 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert") pod "infra-operator-controller-manager-79955696d6-bz4zb" (UID: "a3671fee-5baf-4bcf-8246-49b65ef8f0c8") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:16 crc kubenswrapper[4841]: I0130 05:23:16.433116 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.433300 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.433381 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert podName:13851207-9bc9-41c9-b6d6-3dab03a5e62c nodeName:}" failed. No retries permitted until 2026-01-30 05:23:24.433362171 +0000 UTC m=+941.426834819 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" (UID: "13851207-9bc9-41c9-b6d6-3dab03a5e62c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 05:23:16 crc kubenswrapper[4841]: I0130 05:23:16.551323 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-frp95" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="registry-server" containerID="cri-o://b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b" gracePeriod=2 Jan 30 05:23:16 crc kubenswrapper[4841]: I0130 05:23:16.839112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:16 crc kubenswrapper[4841]: I0130 05:23:16.839471 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.839677 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.839807 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs 
podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:24.839788849 +0000 UTC m=+941.833261487 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "metrics-server-cert" not found Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.840033 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 05:23:16 crc kubenswrapper[4841]: E0130 05:23:16.840184 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs podName:1e0a24a0-c7c6-4c83-94f9-918314ee3ac7 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:24.840150939 +0000 UTC m=+941.833623637 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-mbf2k" (UID: "1e0a24a0-c7c6-4c83-94f9-918314ee3ac7") : secret "webhook-server-cert" not found Jan 30 05:23:17 crc kubenswrapper[4841]: I0130 05:23:17.566601 4841 generic.go:334] "Generic (PLEG): container finished" podID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerID="b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b" exitCode=0 Jan 30 05:23:17 crc kubenswrapper[4841]: I0130 05:23:17.566684 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frp95" event={"ID":"f92a542d-4baf-4e08-9da4-02f641b46d94","Type":"ContainerDied","Data":"b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b"} Jan 30 05:23:22 crc kubenswrapper[4841]: E0130 05:23:22.692954 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b is running failed: container process not found" containerID="b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:23:22 crc kubenswrapper[4841]: E0130 05:23:22.695032 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b is running failed: container process not found" containerID="b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:23:22 crc kubenswrapper[4841]: E0130 05:23:22.695274 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b is running failed: container process not found" containerID="b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 05:23:22 crc kubenswrapper[4841]: E0130 05:23:22.695340 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-frp95" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="registry-server" Jan 30 05:23:23 crc kubenswrapper[4841]: E0130 05:23:23.304043 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a" Jan 30 05:23:23 crc kubenswrapper[4841]: E0130 05:23:23.304198 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5h6q7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-zfhk4_openstack-operators(4c201612-a22c-44ad-8f91-c5e4f45e895f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:23 crc kubenswrapper[4841]: E0130 05:23:23.305446 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" podUID="4c201612-a22c-44ad-8f91-c5e4f45e895f" Jan 30 05:23:23 crc kubenswrapper[4841]: E0130 05:23:23.622174 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" podUID="4c201612-a22c-44ad-8f91-c5e4f45e895f" Jan 30 05:23:23 crc kubenswrapper[4841]: E0130 05:23:23.998770 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:23.999266 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-758pt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69d6db494d-69jsz_openstack-operators(36279e2b-d43d-452c-a159-269eccab814a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:24.000475 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" podUID="36279e2b-d43d-452c-a159-269eccab814a" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:24.164170 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.165025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:24.165100 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert podName:a3671fee-5baf-4bcf-8246-49b65ef8f0c8 nodeName:}" failed. No retries permitted until 2026-01-30 05:23:40.165079279 +0000 UTC m=+957.158551927 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert") pod "infra-operator-controller-manager-79955696d6-bz4zb" (UID: "a3671fee-5baf-4bcf-8246-49b65ef8f0c8") : secret "infra-operator-webhook-server-cert" not found Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.473451 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.486340 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13851207-9bc9-41c9-b6d6-3dab03a5e62c-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f\" (UID: \"13851207-9bc9-41c9-b6d6-3dab03a5e62c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:24.631957 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" podUID="36279e2b-d43d-452c-a159-269eccab814a" Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.704713 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:24.712869 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:24.713040 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b2l5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5f4b8bd54d-74h94_openstack-operators(e0cb95b9-d9f5-4927-b7e5-47199da17894): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:24 crc kubenswrapper[4841]: E0130 05:23:24.714379 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" podUID="e0cb95b9-d9f5-4927-b7e5-47199da17894" Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.879122 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.879197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.883390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:24 crc kubenswrapper[4841]: I0130 05:23:24.895232 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e0a24a0-c7c6-4c83-94f9-918314ee3ac7-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-mbf2k\" (UID: \"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:25 crc kubenswrapper[4841]: I0130 05:23:25.165228 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:25 crc kubenswrapper[4841]: E0130 05:23:25.355370 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Jan 30 05:23:25 crc kubenswrapper[4841]: E0130 05:23:25.355585 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l8wrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-b64xq_openstack-operators(d3675976-12f6-4169-8354-b3fbca99354a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:25 crc kubenswrapper[4841]: E0130 05:23:25.356762 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" podUID="d3675976-12f6-4169-8354-b3fbca99354a" Jan 30 05:23:25 crc kubenswrapper[4841]: E0130 05:23:25.643531 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" podUID="e0cb95b9-d9f5-4927-b7e5-47199da17894" Jan 30 05:23:25 crc kubenswrapper[4841]: E0130 05:23:25.648472 4841 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" podUID="d3675976-12f6-4169-8354-b3fbca99354a" Jan 30 05:23:26 crc kubenswrapper[4841]: E0130 05:23:26.145289 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241" Jan 30 05:23:26 crc kubenswrapper[4841]: E0130 05:23:26.145646 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zpjg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-vcq29_openstack-operators(14b05ee0-2b05-4406-8721-979476d7c5be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:26 crc kubenswrapper[4841]: E0130 05:23:26.146731 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" podUID="14b05ee0-2b05-4406-8721-979476d7c5be" Jan 30 05:23:26 crc kubenswrapper[4841]: E0130 05:23:26.651056 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" podUID="14b05ee0-2b05-4406-8721-979476d7c5be" Jan 30 05:23:26 crc kubenswrapper[4841]: E0130 05:23:26.667114 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6" Jan 30 05:23:26 crc kubenswrapper[4841]: E0130 05:23:26.667248 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w6lr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-585dbc889-ptsr8_openstack-operators(9dfc9a4b-0426-4293-b78d-e63c74b0ec96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:26 crc kubenswrapper[4841]: E0130 05:23:26.668465 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" podUID="9dfc9a4b-0426-4293-b78d-e63c74b0ec96" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.287042 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.287221 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwgft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-cwtpg_openstack-operators(a9d964d8-8035-47e5-9ed9-c7713882002c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.289326 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" podUID="a9d964d8-8035-47e5-9ed9-c7713882002c" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.657441 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" podUID="a9d964d8-8035-47e5-9ed9-c7713882002c" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.657628 4841 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" podUID="9dfc9a4b-0426-4293-b78d-e63c74b0ec96" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.951474 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.951684 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kwrnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d874c8fc-c857j_openstack-operators(683d1de5-a849-4cc4-ad31-e4ddce58ce3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:23:27 crc kubenswrapper[4841]: E0130 05:23:27.952901 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" podUID="683d1de5-a849-4cc4-ad31-e4ddce58ce3a" Jan 30 05:23:28 crc kubenswrapper[4841]: E0130 05:23:28.663342 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" podUID="683d1de5-a849-4cc4-ad31-e4ddce58ce3a" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.312338 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frp95" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.459157 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pdhn\" (UniqueName: \"kubernetes.io/projected/f92a542d-4baf-4e08-9da4-02f641b46d94-kube-api-access-9pdhn\") pod \"f92a542d-4baf-4e08-9da4-02f641b46d94\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.459207 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-utilities\") pod \"f92a542d-4baf-4e08-9da4-02f641b46d94\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.459287 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-catalog-content\") pod \"f92a542d-4baf-4e08-9da4-02f641b46d94\" (UID: \"f92a542d-4baf-4e08-9da4-02f641b46d94\") " Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.460860 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-utilities" (OuterVolumeSpecName: "utilities") pod "f92a542d-4baf-4e08-9da4-02f641b46d94" (UID: "f92a542d-4baf-4e08-9da4-02f641b46d94"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.474471 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92a542d-4baf-4e08-9da4-02f641b46d94-kube-api-access-9pdhn" (OuterVolumeSpecName: "kube-api-access-9pdhn") pod "f92a542d-4baf-4e08-9da4-02f641b46d94" (UID: "f92a542d-4baf-4e08-9da4-02f641b46d94"). InnerVolumeSpecName "kube-api-access-9pdhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.505602 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f92a542d-4baf-4e08-9da4-02f641b46d94" (UID: "f92a542d-4baf-4e08-9da4-02f641b46d94"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.560584 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pdhn\" (UniqueName: \"kubernetes.io/projected/f92a542d-4baf-4e08-9da4-02f641b46d94-kube-api-access-9pdhn\") on node \"crc\" DevicePath \"\"" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.560608 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.560618 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f92a542d-4baf-4e08-9da4-02f641b46d94-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.672222 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frp95" 
event={"ID":"f92a542d-4baf-4e08-9da4-02f641b46d94","Type":"ContainerDied","Data":"743ded506fa512af6c0118484eca5ddbac846ee8230d2f7e8e66fc5de9e0cf90"}
Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.672288 4841 scope.go:117] "RemoveContainer" containerID="b98b4b3f983d3c97bae7ad2c0ac24067b4f164169f6bc984155fe3910d0ee18b"
Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.672484 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frp95"
Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.725872 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frp95"]
Jan 30 05:23:29 crc kubenswrapper[4841]: I0130 05:23:29.734200 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-frp95"]
Jan 30 05:23:30 crc kubenswrapper[4841]: I0130 05:23:30.439831 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" path="/var/lib/kubelet/pods/f92a542d-4baf-4e08-9da4-02f641b46d94/volumes"
Jan 30 05:23:31 crc kubenswrapper[4841]: E0130 05:23:31.328500 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2"
Jan 30 05:23:31 crc kubenswrapper[4841]: E0130 05:23:31.328739 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-899hv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wbqf2_openstack-operators(9585c98b-78c2-4860-a00f-4347390f4432): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 30 05:23:31 crc kubenswrapper[4841]: E0130 05:23:31.330122 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2" podUID="9585c98b-78c2-4860-a00f-4347390f4432"
Jan 30 05:23:31 crc kubenswrapper[4841]: E0130 05:23:31.697728 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2" podUID="9585c98b-78c2-4860-a00f-4347390f4432"
Jan 30 05:23:32 crc kubenswrapper[4841]: I0130 05:23:32.940620 4841 scope.go:117] "RemoveContainer" containerID="19378b28dc95e086fad25cb225d457941eb2c49ec1c1c4dca098bcd9994e58dd"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.079786 4841 scope.go:117] "RemoveContainer" containerID="6d591635894f482579edc61b5d0f40a687c509fc10cc0691ddd34b71e0991ba8"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.339148 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f"]
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.376431 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"]
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.711120 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" event={"ID":"22c16a0e-8121-40ce-82a1-6129d8f4b017","Type":"ContainerStarted","Data":"803f18ffcef457d1af046e0184a27c50d3997d686d9a367dc65272511ec4781d"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.711223 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.712439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" event={"ID":"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7","Type":"ContainerStarted","Data":"5bd42cf8c5ee13066b97196642a695c699071231da51277b91f3363fcf3cc98c"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.712469 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" event={"ID":"1e0a24a0-c7c6-4c83-94f9-918314ee3ac7","Type":"ContainerStarted","Data":"90f4071b252cc751c6c05186ee3c31153f805bd5f5dfe72a026a84b7ad900dbf"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.712564 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.713910 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" event={"ID":"13851207-9bc9-41c9-b6d6-3dab03a5e62c","Type":"ContainerStarted","Data":"2f415a9605f9927f80c2ba046897087ae7b78e259e0d739fef212c24756b6e3e"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.715671 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" event={"ID":"72033411-0acc-4925-85c1-9fe48cb2157d","Type":"ContainerStarted","Data":"3901bc1be9bb08320d9eeee1b02af7d8179836d23741de3be872853379109fea"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.715854 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.717045 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" event={"ID":"382d26e6-1e34-4de0-8e4a-50230ce1a90f","Type":"ContainerStarted","Data":"e71c6c662221554e3693005eb882cd18a584f65f78079312877f6077ca99d97a"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.717167 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.718161 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw" event={"ID":"cad53450-3002-49dc-bde3-c32d90ec2272","Type":"ContainerStarted","Data":"deec8312503d02ecdf4784bbc5e376f85f68c8c0de1feecafd85c304c6c2674a"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.718508 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.719851 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" event={"ID":"5abd246f-9cd6-44a5-b189-3f757aa6904b","Type":"ContainerStarted","Data":"8ad3a4e7b1acefad0ae9ee1b916fdd2c86e7ac4aa50fcc417195fc2fc4a4561e"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.720000 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.721032 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" event={"ID":"da35fca1-95bf-4bf9-9dc6-2696846c402d","Type":"ContainerStarted","Data":"bef55dafddf36d8a087b18860c8c090da0a5d752d042d116035333ba70067a4c"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.721353 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.722506 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" event={"ID":"53bed426-a8df-4f33-8c52-c838d1a47f35","Type":"ContainerStarted","Data":"4ae5f75aed2b362570fb2ef687cb3ce1285ecf25eafc9a45ab39c4b2ca0855a8"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.722812 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.724130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" event={"ID":"f8753b24-bc7c-4623-904c-a7c7c0dd7aec","Type":"ContainerStarted","Data":"4308441be5a064d0b9b33a843bef952d446f7b27157e917b24b5cccc03fef413"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.724453 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.725610 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" event={"ID":"3b8874f3-5993-4156-be01-f7952851fb6f","Type":"ContainerStarted","Data":"1de5e29f5b164d78b88c7e792393c2b04403f33a01a9330306bb6bc2a7ab194c"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.725929 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.727207 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" event={"ID":"99db8f34-be75-405c-abfc-c79f8a246b3a","Type":"ContainerStarted","Data":"7a5d5c8a36c36628a5346b4bc6d1f8315534d94032bfc00c256dc271f4f19873"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.727556 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.729676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" event={"ID":"0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8","Type":"ContainerStarted","Data":"1ad29bf051137ec3266cbd51fb6735eba79682ad2ba1eb106c8c99fee630dd20"}
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.729979 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.752357 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p" podStartSLOduration=5.025494313 podStartE2EDuration="25.75234055s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.94720125 +0000 UTC m=+926.940673888" lastFinishedPulling="2026-01-30 05:23:30.674047487 +0000 UTC m=+947.667520125" observedRunningTime="2026-01-30 05:23:33.736514531 +0000 UTC m=+950.729987169" watchObservedRunningTime="2026-01-30 05:23:33.75234055 +0000 UTC m=+950.745813188"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.784554 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x" podStartSLOduration=5.770572462 podStartE2EDuration="26.784538822s" podCreationTimestamp="2026-01-30 05:23:07 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.65984611 +0000 UTC m=+926.653318748" lastFinishedPulling="2026-01-30 05:23:30.67381247 +0000 UTC m=+947.667285108" observedRunningTime="2026-01-30 05:23:33.755839725 +0000 UTC m=+950.749312363" watchObservedRunningTime="2026-01-30 05:23:33.784538822 +0000 UTC m=+950.778011460"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.805128 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq" podStartSLOduration=4.75763671 podStartE2EDuration="26.80511294s" podCreationTimestamp="2026-01-30 05:23:07 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.267899374 +0000 UTC m=+926.261372012" lastFinishedPulling="2026-01-30 05:23:31.315375594 +0000 UTC m=+948.308848242" observedRunningTime="2026-01-30 05:23:33.8010463 +0000 UTC m=+950.794518928" watchObservedRunningTime="2026-01-30 05:23:33.80511294 +0000 UTC m=+950.798585578"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.806358 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t" podStartSLOduration=3.171900082 podStartE2EDuration="25.806352644s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:10.317615872 +0000 UTC m=+927.311088510" lastFinishedPulling="2026-01-30 05:23:32.952068434 +0000 UTC m=+949.945541072" observedRunningTime="2026-01-30 05:23:33.78739237 +0000 UTC m=+950.780865008" watchObservedRunningTime="2026-01-30 05:23:33.806352644 +0000 UTC m=+950.799825282"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.829851 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2" podStartSLOduration=3.7544576750000003 podStartE2EDuration="25.82983588s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.993191787 +0000 UTC m=+926.986664425" lastFinishedPulling="2026-01-30 05:23:32.068569992 +0000 UTC m=+949.062042630" observedRunningTime="2026-01-30 05:23:33.826612513 +0000 UTC m=+950.820085151" watchObservedRunningTime="2026-01-30 05:23:33.82983588 +0000 UTC m=+950.823308518"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.880291 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr" podStartSLOduration=4.432715062 podStartE2EDuration="26.880267627s" podCreationTimestamp="2026-01-30 05:23:07 +0000 UTC" firstStartedPulling="2026-01-30 05:23:08.866810501 +0000 UTC m=+925.860283139" lastFinishedPulling="2026-01-30 05:23:31.314363046 +0000 UTC m=+948.307835704" observedRunningTime="2026-01-30 05:23:33.867119731 +0000 UTC m=+950.860592369" watchObservedRunningTime="2026-01-30 05:23:33.880267627 +0000 UTC m=+950.873740265"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.895043 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj" podStartSLOduration=3.137347985 podStartE2EDuration="25.895030468s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:10.211931987 +0000 UTC m=+927.205404625" lastFinishedPulling="2026-01-30 05:23:32.96961447 +0000 UTC m=+949.963087108" observedRunningTime="2026-01-30 05:23:33.889265522 +0000 UTC m=+950.882738160" watchObservedRunningTime="2026-01-30 05:23:33.895030468 +0000 UTC m=+950.888503106"
Jan 30 05:23:33 crc kubenswrapper[4841]: I0130 05:23:33.946906 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw" podStartSLOduration=4.8033142 podStartE2EDuration="25.946889234s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:10.172225691 +0000 UTC m=+927.165698319" lastFinishedPulling="2026-01-30 05:23:31.315800685 +0000 UTC m=+948.309273353" observedRunningTime="2026-01-30 05:23:33.927236751 +0000 UTC m=+950.920709389" watchObservedRunningTime="2026-01-30 05:23:33.946889234 +0000 UTC m=+950.940361872"
Jan 30 05:23:34 crc kubenswrapper[4841]: I0130 05:23:34.001546 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s" podStartSLOduration=4.346922956 podStartE2EDuration="26.001527575s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.659749517 +0000 UTC m=+926.653222155" lastFinishedPulling="2026-01-30 05:23:31.314354106 +0000 UTC m=+948.307826774" observedRunningTime="2026-01-30 05:23:33.962671021 +0000 UTC m=+950.956143659" watchObservedRunningTime="2026-01-30 05:23:34.001527575 +0000 UTC m=+950.995000223"
Jan 30 05:23:34 crc kubenswrapper[4841]: I0130 05:23:34.001832 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" podStartSLOduration=26.001828493 podStartE2EDuration="26.001828493s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:23:33.996083537 +0000 UTC m=+950.989556175" watchObservedRunningTime="2026-01-30 05:23:34.001828493 +0000 UTC m=+950.995301131"
Jan 30 05:23:34 crc kubenswrapper[4841]: I0130 05:23:34.059243 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" podStartSLOduration=3.329416052 podStartE2EDuration="26.059227719s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:10.211817894 +0000 UTC m=+927.205290532" lastFinishedPulling="2026-01-30 05:23:32.941629541 +0000 UTC m=+949.935102199" observedRunningTime="2026-01-30 05:23:34.057940734 +0000 UTC m=+951.051413372" watchObservedRunningTime="2026-01-30 05:23:34.059227719 +0000 UTC m=+951.052700357"
Jan 30 05:23:34 crc kubenswrapper[4841]: I0130 05:23:34.059353 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4" podStartSLOduration=3.3291951060000002 podStartE2EDuration="26.059350612s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:10.211598208 +0000 UTC m=+927.205070846" lastFinishedPulling="2026-01-30 05:23:32.941753714 +0000 UTC m=+949.935226352" observedRunningTime="2026-01-30 05:23:34.03526362 +0000 UTC m=+951.028736258" watchObservedRunningTime="2026-01-30 05:23:34.059350612 +0000 UTC m=+951.052823250"
Jan 30 05:23:36 crc kubenswrapper[4841]: I0130 05:23:36.757026 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" event={"ID":"4c201612-a22c-44ad-8f91-c5e4f45e895f","Type":"ContainerStarted","Data":"7413824d423d870299d98409e328cb34fa75d5d96fc01bc1b3727b3df93e0b98"}
Jan 30 05:23:36 crc kubenswrapper[4841]: I0130 05:23:36.758036 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4"
Jan 30 05:23:36 crc kubenswrapper[4841]: I0130 05:23:36.762140 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" event={"ID":"13851207-9bc9-41c9-b6d6-3dab03a5e62c","Type":"ContainerStarted","Data":"b0fc7c51ec5ddd31b591a10eb9455ac6a50784b4d352844c88d065e14886ca74"}
Jan 30 05:23:36 crc kubenswrapper[4841]: I0130 05:23:36.762378 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f"
Jan 30 05:23:36 crc kubenswrapper[4841]: I0130 05:23:36.793776 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" podStartSLOduration=3.866561494 podStartE2EDuration="28.793751102s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.949636316 +0000 UTC m=+926.943108954" lastFinishedPulling="2026-01-30 05:23:34.876825924 +0000 UTC m=+951.870298562" observedRunningTime="2026-01-30 05:23:36.784962344 +0000 UTC m=+953.778435022" watchObservedRunningTime="2026-01-30 05:23:36.793751102 +0000 UTC m=+953.787223780"
Jan 30 05:23:36 crc kubenswrapper[4841]: I0130 05:23:36.820128 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" podStartSLOduration=26.591094758 podStartE2EDuration="28.820108237s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:33.393914203 +0000 UTC m=+950.387386841" lastFinishedPulling="2026-01-30 05:23:35.622927682 +0000 UTC m=+952.616400320" observedRunningTime="2026-01-30 05:23:36.814745082 +0000 UTC m=+953.808217780" watchObservedRunningTime="2026-01-30 05:23:36.820108237 +0000 UTC m=+953.813580885"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.307590 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-ggldr"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.329509 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b9wjq"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.379349 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-5bf7x"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.417090 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-qf8h2"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.556394 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-8622s"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.643276 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-6mfjj"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.747565 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s227p"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.778640 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" event={"ID":"e0cb95b9-d9f5-4927-b7e5-47199da17894","Type":"ContainerStarted","Data":"38e68b8f414a27a1d6b3c02af9b7805195cdc5be3526913fda8f8fb34b55594f"}
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.779686 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.804156 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-fhf8t"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.804340 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" podStartSLOduration=2.408765992 podStartE2EDuration="30.804321789s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.449137057 +0000 UTC m=+926.442609695" lastFinishedPulling="2026-01-30 05:23:37.844692844 +0000 UTC m=+954.838165492" observedRunningTime="2026-01-30 05:23:38.800463525 +0000 UTC m=+955.793936173" watchObservedRunningTime="2026-01-30 05:23:38.804321789 +0000 UTC m=+955.797794427"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.830262 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-fjqg4"
Jan 30 05:23:38 crc kubenswrapper[4841]: I0130 05:23:38.874964 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv"
Jan 30 05:23:39 crc kubenswrapper[4841]: I0130 05:23:39.204488 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-hlthw"
Jan 30 05:23:39 crc kubenswrapper[4841]: I0130 05:23:39.786423 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" event={"ID":"14b05ee0-2b05-4406-8721-979476d7c5be","Type":"ContainerStarted","Data":"0865ff2a6144de011f03166d38c9b6a3b8c966aa21b19e625156c2a717796e96"}
Jan 30 05:23:39 crc kubenswrapper[4841]: I0130 05:23:39.787289 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29"
Jan 30 05:23:39 crc kubenswrapper[4841]: I0130 05:23:39.814331 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" podStartSLOduration=2.985472909 podStartE2EDuration="31.814312451s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:10.172638842 +0000 UTC m=+927.166111480" lastFinishedPulling="2026-01-30 05:23:39.001478364 +0000 UTC m=+955.994951022" observedRunningTime="2026-01-30 05:23:39.810314932 +0000 UTC m=+956.803787570" watchObservedRunningTime="2026-01-30 05:23:39.814312451 +0000 UTC m=+956.807785089"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.166509 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.182973 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3671fee-5baf-4bcf-8246-49b65ef8f0c8-cert\") pod \"infra-operator-controller-manager-79955696d6-bz4zb\" (UID: \"a3671fee-5baf-4bcf-8246-49b65ef8f0c8\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.234656 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.468220 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.468509 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.468544 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.469072 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6383000b3c9a91b0197a8dcc28ebef33bc3b14f4f0dc8fac3dcfc3c8dcddb775"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.469118 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://6383000b3c9a91b0197a8dcc28ebef33bc3b14f4f0dc8fac3dcfc3c8dcddb775" gracePeriod=600
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.741202 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb"]
Jan 30 05:23:40 crc kubenswrapper[4841]: W0130 05:23:40.748126 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3671fee_5baf_4bcf_8246_49b65ef8f0c8.slice/crio-a6b8220c9728fc96fb5cc28b3a763505c853cefe3191c230644a98e2a109dc23 WatchSource:0}: Error finding container a6b8220c9728fc96fb5cc28b3a763505c853cefe3191c230644a98e2a109dc23: Status 404 returned error can't find the container with id a6b8220c9728fc96fb5cc28b3a763505c853cefe3191c230644a98e2a109dc23
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.794563 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" event={"ID":"a9d964d8-8035-47e5-9ed9-c7713882002c","Type":"ContainerStarted","Data":"2270e45885adc98fa8587f9b5c16615324f64078762ff67fb7f3d5984fc02d20"}
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.794715 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.796007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" event={"ID":"36279e2b-d43d-452c-a159-269eccab814a","Type":"ContainerStarted","Data":"242e3138ce7aa33f60e4de9cac236b4fbda3c0139a9dbea5227044c1828c92de"}
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.796223 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.797015 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" event={"ID":"a3671fee-5baf-4bcf-8246-49b65ef8f0c8","Type":"ContainerStarted","Data":"a6b8220c9728fc96fb5cc28b3a763505c853cefe3191c230644a98e2a109dc23"}
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.798898 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="6383000b3c9a91b0197a8dcc28ebef33bc3b14f4f0dc8fac3dcfc3c8dcddb775" exitCode=0
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.798917 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"6383000b3c9a91b0197a8dcc28ebef33bc3b14f4f0dc8fac3dcfc3c8dcddb775"}
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.798948 4841 scope.go:117] "RemoveContainer" containerID="59fbdcdf822ce8767e625a9fdb4978cefa9eaf5250b991d0c4b4b761a1a7e71e"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.799829 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" event={"ID":"d3675976-12f6-4169-8354-b3fbca99354a","Type":"ContainerStarted","Data":"ed429fc32773322b7cbb17cedcb7f997747b985f3e7e6976375c69ea1c610a81"}
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.812377 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" podStartSLOduration=2.878436806 podStartE2EDuration="32.812359587s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.977382699 +0000 UTC m=+926.970855337" lastFinishedPulling="2026-01-30 05:23:39.91130544 +0000 UTC m=+956.904778118" observedRunningTime="2026-01-30 05:23:40.806975672 +0000 UTC m=+957.800448310" watchObservedRunningTime="2026-01-30 05:23:40.812359587 +0000 UTC m=+957.805832235"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.840458 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" podStartSLOduration=3.853487469 podStartE2EDuration="33.840442108s" podCreationTimestamp="2026-01-30 05:23:07 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.921778681 +0000 UTC m=+926.915251309" lastFinishedPulling="2026-01-30 05:23:39.90873327 +0000 UTC m=+956.902205948" observedRunningTime="2026-01-30 05:23:40.837736236 +0000 UTC m=+957.831208884" watchObservedRunningTime="2026-01-30 05:23:40.840442108 +0000 UTC m=+957.833914736"
Jan 30 05:23:40 crc kubenswrapper[4841]: I0130 05:23:40.840573 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" podStartSLOduration=2.750150269 podStartE2EDuration="32.840569803s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.817660229 +0000 UTC m=+926.811132867" lastFinishedPulling="2026-01-30 05:23:39.908079723 +0000 UTC m=+956.901552401" observedRunningTime="2026-01-30 05:23:40.824514877 +0000 UTC m=+957.817987515" watchObservedRunningTime="2026-01-30 05:23:40.840569803 +0000 UTC m=+957.834042441"
Jan 30 05:23:41 crc kubenswrapper[4841]: I0130 05:23:41.807426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" event={"ID":"9dfc9a4b-0426-4293-b78d-e63c74b0ec96","Type":"ContainerStarted","Data":"03ac07495595f4a139095ff2bfe32177041b4cf1d1f1bd98946c603f2c41293b"}
Jan 30 05:23:41 crc kubenswrapper[4841]: I0130 05:23:41.808028 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8"
Jan 30 05:23:41 crc kubenswrapper[4841]: I0130 05:23:41.810873 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"344e964639b293843f58c03939c43edd9bcd822c2de642650f9f33c6e2a4eb20"}
Jan 30 05:23:41 crc kubenswrapper[4841]: I0130 05:23:41.812383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" event={"ID":"683d1de5-a849-4cc4-ad31-e4ddce58ce3a","Type":"ContainerStarted","Data":"50d763b21c61305870fa72c5bb368db61ea61a6adb1074eb7e13ac74d74c1a44"}
Jan 30 05:23:41 crc kubenswrapper[4841]: I0130 05:23:41.812625 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j"
Jan 30 05:23:41 crc kubenswrapper[4841]: I0130 05:23:41.826198 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" podStartSLOduration=2.89994315 podStartE2EDuration="33.826184922s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.949232856 +0000 UTC m=+926.942705494" lastFinishedPulling="2026-01-30 05:23:40.875474628 +0000 UTC m=+957.868947266" observedRunningTime="2026-01-30 05:23:41.822267776 +0000 UTC m=+958.815740405" watchObservedRunningTime="2026-01-30 05:23:41.826184922 +0000 UTC m=+958.819657560"
Jan 30 05:23:41 crc kubenswrapper[4841]: I0130 05:23:41.842805 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" podStartSLOduration=3.633851804 podStartE2EDuration="34.842788592s" podCreationTimestamp="2026-01-30 05:23:07 +0000 UTC" firstStartedPulling="2026-01-30 05:23:09.65914374 +0000 UTC m=+926.652616378" lastFinishedPulling="2026-01-30 05:23:40.868080538 +0000 UTC m=+957.861553166" observedRunningTime="2026-01-30 05:23:41.839661918 +0000 UTC m=+958.833134556" watchObservedRunningTime="2026-01-30 05:23:41.842788592 +0000 UTC
m=+958.836261220" Jan 30 05:23:43 crc kubenswrapper[4841]: I0130 05:23:43.831923 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" event={"ID":"a3671fee-5baf-4bcf-8246-49b65ef8f0c8","Type":"ContainerStarted","Data":"1924ebcf23c1224057bfaa52f8e84872d68e2429731e20c4cfe094c9af6051f4"} Jan 30 05:23:43 crc kubenswrapper[4841]: I0130 05:23:43.832672 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:23:43 crc kubenswrapper[4841]: I0130 05:23:43.872927 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" podStartSLOduration=33.632661076 podStartE2EDuration="35.872901279s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:40.751762765 +0000 UTC m=+957.745235403" lastFinishedPulling="2026-01-30 05:23:42.992002968 +0000 UTC m=+959.985475606" observedRunningTime="2026-01-30 05:23:43.854011597 +0000 UTC m=+960.847484275" watchObservedRunningTime="2026-01-30 05:23:43.872901279 +0000 UTC m=+960.866373957" Jan 30 05:23:44 crc kubenswrapper[4841]: I0130 05:23:44.713025 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f" Jan 30 05:23:44 crc kubenswrapper[4841]: I0130 05:23:44.853073 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2" event={"ID":"9585c98b-78c2-4860-a00f-4347390f4432","Type":"ContainerStarted","Data":"a859ca4168dc08b1d84a45e01bef9ef05fe6301b29115c394259d41f2a238e1c"} Jan 30 05:23:44 crc kubenswrapper[4841]: I0130 05:23:44.876269 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wbqf2" podStartSLOduration=3.311820885 podStartE2EDuration="36.87625008s" podCreationTimestamp="2026-01-30 05:23:08 +0000 UTC" firstStartedPulling="2026-01-30 05:23:10.2872841 +0000 UTC m=+927.280756738" lastFinishedPulling="2026-01-30 05:23:43.851713285 +0000 UTC m=+960.845185933" observedRunningTime="2026-01-30 05:23:44.871348297 +0000 UTC m=+961.864820945" watchObservedRunningTime="2026-01-30 05:23:44.87625008 +0000 UTC m=+961.869722718" Jan 30 05:23:45 crc kubenswrapper[4841]: I0130 05:23:45.174890 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-mbf2k" Jan 30 05:23:48 crc kubenswrapper[4841]: I0130 05:23:48.312289 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c857j" Jan 30 05:23:48 crc kubenswrapper[4841]: I0130 05:23:48.368559 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-69jsz" Jan 30 05:23:48 crc kubenswrapper[4841]: I0130 05:23:48.456608 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-74h94" Jan 30 05:23:48 crc kubenswrapper[4841]: I0130 05:23:48.548212 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" Jan 30 05:23:48 crc kubenswrapper[4841]: I0130 05:23:48.550973 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-b64xq" Jan 30 05:23:48 crc kubenswrapper[4841]: I0130 05:23:48.702211 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cwtpg" Jan 30 05:23:48 crc kubenswrapper[4841]: I0130 05:23:48.730393 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-ptsr8" Jan 30 05:23:49 crc kubenswrapper[4841]: I0130 05:23:49.006322 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-zfhk4" Jan 30 05:23:49 crc kubenswrapper[4841]: I0130 05:23:49.181646 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vcq29" Jan 30 05:23:50 crc kubenswrapper[4841]: I0130 05:23:50.246486 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.830684 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-j7qw7"] Jan 30 05:24:06 crc kubenswrapper[4841]: E0130 05:24:06.831330 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="registry-server" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.831343 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="registry-server" Jan 30 05:24:06 crc kubenswrapper[4841]: E0130 05:24:06.831350 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="extract-utilities" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.831356 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="extract-utilities" Jan 30 05:24:06 crc kubenswrapper[4841]: E0130 05:24:06.831386 4841 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="extract-content" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.831393 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="extract-content" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.831517 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92a542d-4baf-4e08-9da4-02f641b46d94" containerName="registry-server" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.832127 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.834719 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.834813 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fwp8p" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.835073 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.835820 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.845768 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-j7qw7"] Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.881648 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5r7fq"] Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.883197 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.893339 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.893850 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5r7fq"] Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.925835 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-config\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.925875 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzljg\" (UniqueName: \"kubernetes.io/projected/32147ec7-e387-4d0b-9536-1341f982729e-kube-api-access-vzljg\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.925915 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwxxz\" (UniqueName: \"kubernetes.io/projected/beed4aec-0cf6-48ef-b3a1-55843db48bf7-kube-api-access-kwxxz\") pod \"dnsmasq-dns-84bb9d8bd9-j7qw7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.925929 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-dns-svc\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " 
pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:06 crc kubenswrapper[4841]: I0130 05:24:06.925972 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beed4aec-0cf6-48ef-b3a1-55843db48bf7-config\") pod \"dnsmasq-dns-84bb9d8bd9-j7qw7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.027576 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-config\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.027922 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzljg\" (UniqueName: \"kubernetes.io/projected/32147ec7-e387-4d0b-9536-1341f982729e-kube-api-access-vzljg\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.028244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwxxz\" (UniqueName: \"kubernetes.io/projected/beed4aec-0cf6-48ef-b3a1-55843db48bf7-kube-api-access-kwxxz\") pod \"dnsmasq-dns-84bb9d8bd9-j7qw7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.028474 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-dns-svc\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 
05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.028720 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beed4aec-0cf6-48ef-b3a1-55843db48bf7-config\") pod \"dnsmasq-dns-84bb9d8bd9-j7qw7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.029229 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-config\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.029560 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beed4aec-0cf6-48ef-b3a1-55843db48bf7-config\") pod \"dnsmasq-dns-84bb9d8bd9-j7qw7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.029769 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-dns-svc\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.057581 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwxxz\" (UniqueName: \"kubernetes.io/projected/beed4aec-0cf6-48ef-b3a1-55843db48bf7-kube-api-access-kwxxz\") pod \"dnsmasq-dns-84bb9d8bd9-j7qw7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.058902 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vzljg\" (UniqueName: \"kubernetes.io/projected/32147ec7-e387-4d0b-9536-1341f982729e-kube-api-access-vzljg\") pod \"dnsmasq-dns-5f854695bc-5r7fq\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.151309 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.200381 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.443832 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-j7qw7"] Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.447893 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:24:07 crc kubenswrapper[4841]: I0130 05:24:07.542688 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5r7fq"] Jan 30 05:24:08 crc kubenswrapper[4841]: I0130 05:24:08.051528 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" event={"ID":"beed4aec-0cf6-48ef-b3a1-55843db48bf7","Type":"ContainerStarted","Data":"2cd945876f52946045dfe01de944633e0ae16592a207aecaca5b8694045e8102"} Jan 30 05:24:08 crc kubenswrapper[4841]: I0130 05:24:08.054988 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" event={"ID":"32147ec7-e387-4d0b-9536-1341f982729e","Type":"ContainerStarted","Data":"0e99cca3a74382d1c8f3f581d137dd037eccda69638b4d4b282538a9439381ab"} Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.112512 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5r7fq"] Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.122091 4841 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-99rj4"] Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.124733 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.134367 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-99rj4"] Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.162243 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.162320 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-config\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.162337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp69r\" (UniqueName: \"kubernetes.io/projected/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-kube-api-access-qp69r\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.263725 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " 
pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.265131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.265273 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-config\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.265308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp69r\" (UniqueName: \"kubernetes.io/projected/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-kube-api-access-qp69r\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.267485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-config\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.283528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp69r\" (UniqueName: \"kubernetes.io/projected/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-kube-api-access-qp69r\") pod \"dnsmasq-dns-744ffd65bc-99rj4\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 
05:24:09.387456 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-j7qw7"] Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.412406 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-2szrx"] Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.413456 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.423625 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-2szrx"] Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.452167 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.472063 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-config\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.472161 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55blv\" (UniqueName: \"kubernetes.io/projected/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-kube-api-access-55blv\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.472195 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-dns-svc\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " 
pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.574164 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55blv\" (UniqueName: \"kubernetes.io/projected/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-kube-api-access-55blv\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.574212 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-dns-svc\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.574291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-config\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.575087 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-config\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.578395 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-dns-svc\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.593825 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55blv\" (UniqueName: \"kubernetes.io/projected/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-kube-api-access-55blv\") pod \"dnsmasq-dns-95f5f6995-2szrx\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.732317 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:09 crc kubenswrapper[4841]: I0130 05:24:09.949440 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-99rj4"] Jan 30 05:24:09 crc kubenswrapper[4841]: W0130 05:24:09.971804 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc535d999_3ff1_4fc1_8e7e_2e073c9a1fbe.slice/crio-d3a69b8c234a0db631e87a5124b38211d38b859cee62fc079314bda00ad1254c WatchSource:0}: Error finding container d3a69b8c234a0db631e87a5124b38211d38b859cee62fc079314bda00ad1254c: Status 404 returned error can't find the container with id d3a69b8c234a0db631e87a5124b38211d38b859cee62fc079314bda00ad1254c Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.068772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" event={"ID":"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe","Type":"ContainerStarted","Data":"d3a69b8c234a0db631e87a5124b38211d38b859cee62fc079314bda00ad1254c"} Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.162015 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-2szrx"] Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.280055 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.296322 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 
30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.296502 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.301633 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.301833 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.301906 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.302082 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.302233 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.302517 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xxvtd" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.302622 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403014 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403053 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc423120-ba93-465b-8ef8-871904b901ef-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403257 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403378 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403487 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403530 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpbqm\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-kube-api-access-cpbqm\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " 
pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403588 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403628 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403700 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.403759 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc423120-ba93-465b-8ef8-871904b901ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.505325 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: 
I0130 05:24:10.505417 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.505459 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.505757 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.505935 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.506266 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpbqm\" (UniqueName: 
\"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-kube-api-access-cpbqm\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515423 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc423120-ba93-465b-8ef8-871904b901ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515491 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515528 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc423120-ba93-465b-8ef8-871904b901ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.515956 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.516425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.517609 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.522315 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.522368 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.522583 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc423120-ba93-465b-8ef8-871904b901ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.530113 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.531257 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.532777 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.534079 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fvln7" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.534187 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.534326 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.535313 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.535544 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.538864 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.539039 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc423120-ba93-465b-8ef8-871904b901ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.544293 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpbqm\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-kube-api-access-cpbqm\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " 
pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.544561 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.554603 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616630 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616755 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616776 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad7779ad-0912-4695-853f-3ce786c2e9ae-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616799 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8w6\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-kube-api-access-vg8w6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616822 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616843 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.616999 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.617059 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.617136 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad7779ad-0912-4695-853f-3ce786c2e9ae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.617159 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.623666 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.718893 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.718939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.718973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad7779ad-0912-4695-853f-3ce786c2e9ae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.718992 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.719025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc 
kubenswrapper[4841]: I0130 05:24:10.719052 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.719079 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.719096 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad7779ad-0912-4695-853f-3ce786c2e9ae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.719114 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8w6\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-kube-api-access-vg8w6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.719133 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.719158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.720101 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.720134 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.720258 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.720354 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.720422 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-server-conf\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.720881 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.724906 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.724939 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad7779ad-0912-4695-853f-3ce786c2e9ae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.736302 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.736971 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8w6\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-kube-api-access-vg8w6\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 
05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.741325 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad7779ad-0912-4695-853f-3ce786c2e9ae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.743378 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:10 crc kubenswrapper[4841]: I0130 05:24:10.927250 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.083504 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" event={"ID":"89c123b6-19c8-4eb3-b34e-103dcd6cc16e","Type":"ContainerStarted","Data":"e549ff8f4dc6d61a441649f612ea78299cbed0d8a060616e1583c568b4b2df36"} Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.844805 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.846075 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.848382 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.849276 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-r57sp" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.849545 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.849605 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.853848 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.865285 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.937782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.937867 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.937900 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.937934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrps\" (UniqueName: \"kubernetes.io/projected/8e2461f9-732a-448f-a5d7-7528bc3956e3-kube-api-access-jvrps\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.938010 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.938035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.938065 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:11 crc kubenswrapper[4841]: I0130 05:24:11.938085 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039278 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrps\" (UniqueName: \"kubernetes.io/projected/8e2461f9-732a-448f-a5d7-7528bc3956e3-kube-api-access-jvrps\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039338 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039385 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039423 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.039438 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.041064 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.041601 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 
05:24:12.042041 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.042181 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.042469 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.046187 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.055946 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrps\" (UniqueName: \"kubernetes.io/projected/8e2461f9-732a-448f-a5d7-7528bc3956e3-kube-api-access-jvrps\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.057868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.063874 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " pod="openstack/openstack-galera-0" Jan 30 05:24:12 crc kubenswrapper[4841]: I0130 05:24:12.172129 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.245615 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.248813 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.266595 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-wbm6q" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.266745 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.266923 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.267021 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.277329 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357184 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357323 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357580 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357802 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357836 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7zqc\" (UniqueName: \"kubernetes.io/projected/365caacf-756c-4558-b281-f8644c9c1c5f-kube-api-access-n7zqc\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.357955 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.459996 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460153 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460238 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460366 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460436 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460475 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7zqc\" (UniqueName: 
\"kubernetes.io/projected/365caacf-756c-4558-b281-f8644c9c1c5f-kube-api-access-n7zqc\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.460617 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.461953 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.462678 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.462878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.463837 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-operator-scripts\") 
pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.484082 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.484509 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.497156 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7zqc\" (UniqueName: \"kubernetes.io/projected/365caacf-756c-4558-b281-f8644c9c1c5f-kube-api-access-n7zqc\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.513347 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.595873 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.695116 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.696303 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.699189 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zxtvn" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.699634 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.701650 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.704620 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.765946 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-combined-ca-bundle\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.766014 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5vwx\" (UniqueName: \"kubernetes.io/projected/90edf3da-3cbc-407f-9cfa-de97879f3834-kube-api-access-d5vwx\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.766045 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-kolla-config\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.766066 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-config-data\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.766083 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-memcached-tls-certs\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.866785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-memcached-tls-certs\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.866879 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-combined-ca-bundle\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.866937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5vwx\" (UniqueName: \"kubernetes.io/projected/90edf3da-3cbc-407f-9cfa-de97879f3834-kube-api-access-d5vwx\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " 
pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.866978 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-kolla-config\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.867007 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-config-data\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.868601 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-config-data\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.870532 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-kolla-config\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.870626 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-combined-ca-bundle\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.870683 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-memcached-tls-certs\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:13 crc kubenswrapper[4841]: I0130 05:24:13.882448 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5vwx\" (UniqueName: \"kubernetes.io/projected/90edf3da-3cbc-407f-9cfa-de97879f3834-kube-api-access-d5vwx\") pod \"memcached-0\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " pod="openstack/memcached-0" Jan 30 05:24:14 crc kubenswrapper[4841]: I0130 05:24:14.041079 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.429260 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.431049 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.433191 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-kwnjs" Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.437346 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.494769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h4zg\" (UniqueName: \"kubernetes.io/projected/e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f-kube-api-access-2h4zg\") pod \"kube-state-metrics-0\" (UID: \"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f\") " pod="openstack/kube-state-metrics-0" Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.596270 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h4zg\" (UniqueName: 
\"kubernetes.io/projected/e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f-kube-api-access-2h4zg\") pod \"kube-state-metrics-0\" (UID: \"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f\") " pod="openstack/kube-state-metrics-0" Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.614717 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h4zg\" (UniqueName: \"kubernetes.io/projected/e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f-kube-api-access-2h4zg\") pod \"kube-state-metrics-0\" (UID: \"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f\") " pod="openstack/kube-state-metrics-0" Jan 30 05:24:15 crc kubenswrapper[4841]: I0130 05:24:15.761172 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.219423 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-88rdn"] Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.220914 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.223880 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.224043 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.224801 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-4s66p" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.226975 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-lbv2q"] Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.228502 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.233960 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-88rdn"] Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.240754 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lbv2q"] Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260234 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260271 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-combined-ca-bundle\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260294 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-lib\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-etc-ovs\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260335 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run-ovn\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260352 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9bs\" (UniqueName: \"kubernetes.io/projected/47d25b55-9643-45fd-b2fe-eb593334924d-kube-api-access-2c9bs\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260378 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-log-ovn\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260465 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-run\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260556 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqz9b\" (UniqueName: \"kubernetes.io/projected/582a9577-0530-4793-8723-01681bdcfda4-kube-api-access-rqz9b\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260579 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/582a9577-0530-4793-8723-01681bdcfda4-scripts\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260596 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47d25b55-9643-45fd-b2fe-eb593334924d-scripts\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260613 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-ovn-controller-tls-certs\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.260637 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-log\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361479 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-log-ovn\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361530 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-run\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqz9b\" (UniqueName: \"kubernetes.io/projected/582a9577-0530-4793-8723-01681bdcfda4-kube-api-access-rqz9b\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/582a9577-0530-4793-8723-01681bdcfda4-scripts\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361655 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47d25b55-9643-45fd-b2fe-eb593334924d-scripts\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-ovn-controller-tls-certs\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361694 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-log\") pod 
\"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361731 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361750 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-combined-ca-bundle\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361772 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-lib\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361793 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-etc-ovs\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.361831 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run-ovn\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: 
I0130 05:24:19.362095 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9bs\" (UniqueName: \"kubernetes.io/projected/47d25b55-9643-45fd-b2fe-eb593334924d-kube-api-access-2c9bs\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.362271 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-lib\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.362272 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-etc-ovs\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.362330 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run-ovn\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.362456 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.362469 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-run\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.362592 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-log-ovn\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.362642 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-log\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.364179 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/582a9577-0530-4793-8723-01681bdcfda4-scripts\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.364697 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47d25b55-9643-45fd-b2fe-eb593334924d-scripts\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.367240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-ovn-controller-tls-certs\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " 
pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.368878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-combined-ca-bundle\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.377444 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9bs\" (UniqueName: \"kubernetes.io/projected/47d25b55-9643-45fd-b2fe-eb593334924d-kube-api-access-2c9bs\") pod \"ovn-controller-88rdn\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.377577 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqz9b\" (UniqueName: \"kubernetes.io/projected/582a9577-0530-4793-8723-01681bdcfda4-kube-api-access-rqz9b\") pod \"ovn-controller-ovs-lbv2q\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.435450 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.437616 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.440765 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.441023 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.441228 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.442900 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qlh45" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.447121 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.450989 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466476 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-config\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466541 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466577 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466595 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466755 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc5mt\" (UniqueName: \"kubernetes.io/projected/5df66af1-0c57-44f7-8b54-bc351a3faa66-kube-api-access-kc5mt\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.466846 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.545208 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.567017 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568290 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568355 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-config\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568502 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 
05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568526 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568548 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568578 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc5mt\" (UniqueName: \"kubernetes.io/projected/5df66af1-0c57-44f7-8b54-bc351a3faa66-kube-api-access-kc5mt\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.568607 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.569258 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.569326 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-config\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.570270 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.570589 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.574565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.582952 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.583725 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " 
pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.585762 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc5mt\" (UniqueName: \"kubernetes.io/projected/5df66af1-0c57-44f7-8b54-bc351a3faa66-kube-api-access-kc5mt\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.594145 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:19 crc kubenswrapper[4841]: I0130 05:24:19.760326 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.210395 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.216926 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.221010 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dr52c" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.221123 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.221799 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.222165 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.241853 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.314842 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.314909 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.314943 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" 
(UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.315100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.315227 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2rb\" (UniqueName: \"kubernetes.io/projected/cd33f000-ac38-400f-95b4-d9f6a68d13c0-kube-api-access-tl2rb\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.315282 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.315541 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.315721 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc 
kubenswrapper[4841]: I0130 05:24:22.417635 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.417695 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.417719 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.417745 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.417769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.417785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2rb\" 
(UniqueName: \"kubernetes.io/projected/cd33f000-ac38-400f-95b4-d9f6a68d13c0-kube-api-access-tl2rb\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.417819 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.417846 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.418152 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.418698 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.419049 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.419422 
4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.425148 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.432657 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.438051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.443527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2rb\" (UniqueName: \"kubernetes.io/projected/cd33f000-ac38-400f-95b4-d9f6a68d13c0-kube-api-access-tl2rb\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.448769 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: I0130 05:24:22.547519 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.868166 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.868355 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwxxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-j7qw7_openstack(beed4aec-0cf6-48ef-b3a1-55843db48bf7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.869692 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" podUID="beed4aec-0cf6-48ef-b3a1-55843db48bf7" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.904283 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.904643 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qp69r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-744ffd65bc-99rj4_openstack(c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.908090 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.936198 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.936912 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vzljg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-5r7fq_openstack(32147ec7-e387-4d0b-9536-1341f982729e): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Jan 30 05:24:22 crc kubenswrapper[4841]: E0130 05:24:22.939362 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" podUID="32147ec7-e387-4d0b-9536-1341f982729e" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.174616 4841 generic.go:334] "Generic (PLEG): container finished" podID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerID="4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e" exitCode=0 Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.174719 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" event={"ID":"89c123b6-19c8-4eb3-b34e-103dcd6cc16e","Type":"ContainerDied","Data":"4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e"} Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.286734 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.416804 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.575020 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.604091 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.728091 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.734228 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-88rdn"] Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.742954 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.743258 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-dns-svc\") pod \"32147ec7-e387-4d0b-9536-1341f982729e\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.743392 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzljg\" (UniqueName: \"kubernetes.io/projected/32147ec7-e387-4d0b-9536-1341f982729e-kube-api-access-vzljg\") pod \"32147ec7-e387-4d0b-9536-1341f982729e\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.743425 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beed4aec-0cf6-48ef-b3a1-55843db48bf7-config\") pod \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.743440 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwxxz\" (UniqueName: \"kubernetes.io/projected/beed4aec-0cf6-48ef-b3a1-55843db48bf7-kube-api-access-kwxxz\") pod \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\" (UID: \"beed4aec-0cf6-48ef-b3a1-55843db48bf7\") " Jan 30 05:24:23 crc kubenswrapper[4841]: 
I0130 05:24:23.743491 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-config\") pod \"32147ec7-e387-4d0b-9536-1341f982729e\" (UID: \"32147ec7-e387-4d0b-9536-1341f982729e\") " Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.744374 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32147ec7-e387-4d0b-9536-1341f982729e" (UID: "32147ec7-e387-4d0b-9536-1341f982729e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.744380 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beed4aec-0cf6-48ef-b3a1-55843db48bf7-config" (OuterVolumeSpecName: "config") pod "beed4aec-0cf6-48ef-b3a1-55843db48bf7" (UID: "beed4aec-0cf6-48ef-b3a1-55843db48bf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.746661 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-config" (OuterVolumeSpecName: "config") pod "32147ec7-e387-4d0b-9536-1341f982729e" (UID: "32147ec7-e387-4d0b-9536-1341f982729e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:23 crc kubenswrapper[4841]: W0130 05:24:23.749716 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7a3ee64_f1d6_4f70_90ac_ac7550a66d6f.slice/crio-692f951e0050bdc3d6c9526d02e8fbdbfe850520ea37731736a809e0bc15b7eb WatchSource:0}: Error finding container 692f951e0050bdc3d6c9526d02e8fbdbfe850520ea37731736a809e0bc15b7eb: Status 404 returned error can't find the container with id 692f951e0050bdc3d6c9526d02e8fbdbfe850520ea37731736a809e0bc15b7eb Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.749894 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32147ec7-e387-4d0b-9536-1341f982729e-kube-api-access-vzljg" (OuterVolumeSpecName: "kube-api-access-vzljg") pod "32147ec7-e387-4d0b-9536-1341f982729e" (UID: "32147ec7-e387-4d0b-9536-1341f982729e"). InnerVolumeSpecName "kube-api-access-vzljg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:23 crc kubenswrapper[4841]: W0130 05:24:23.750136 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod365caacf_756c_4558_b281_f8644c9c1c5f.slice/crio-fc053fb5b0d026c63c908afd011d2cb35880c88e1d810dafc464dc552806e00f WatchSource:0}: Error finding container fc053fb5b0d026c63c908afd011d2cb35880c88e1d810dafc464dc552806e00f: Status 404 returned error can't find the container with id fc053fb5b0d026c63c908afd011d2cb35880c88e1d810dafc464dc552806e00f Jan 30 05:24:23 crc kubenswrapper[4841]: W0130 05:24:23.751556 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47d25b55_9643_45fd_b2fe_eb593334924d.slice/crio-298c3e61574db7def40be6b74e7df4b9777b7f2eb9efaf52c249f4dc85769ee6 WatchSource:0}: Error finding container 298c3e61574db7def40be6b74e7df4b9777b7f2eb9efaf52c249f4dc85769ee6: Status 404 returned error can't find the container with id 298c3e61574db7def40be6b74e7df4b9777b7f2eb9efaf52c249f4dc85769ee6 Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.753199 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beed4aec-0cf6-48ef-b3a1-55843db48bf7-kube-api-access-kwxxz" (OuterVolumeSpecName: "kube-api-access-kwxxz") pod "beed4aec-0cf6-48ef-b3a1-55843db48bf7" (UID: "beed4aec-0cf6-48ef-b3a1-55843db48bf7"). InnerVolumeSpecName "kube-api-access-kwxxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.769638 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:24:23 crc kubenswrapper[4841]: W0130 05:24:23.780942 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad7779ad_0912_4695_853f_3ce786c2e9ae.slice/crio-8600967c2ade3e891b207c101977cc2ccfbdba76d18ef1953587344914c55439 WatchSource:0}: Error finding container 8600967c2ade3e891b207c101977cc2ccfbdba76d18ef1953587344914c55439: Status 404 returned error can't find the container with id 8600967c2ade3e891b207c101977cc2ccfbdba76d18ef1953587344914c55439 Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.781727 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.844092 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.847592 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.847631 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzljg\" (UniqueName: \"kubernetes.io/projected/32147ec7-e387-4d0b-9536-1341f982729e-kube-api-access-vzljg\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.847646 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beed4aec-0cf6-48ef-b3a1-55843db48bf7-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.847659 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwxxz\" 
(UniqueName: \"kubernetes.io/projected/beed4aec-0cf6-48ef-b3a1-55843db48bf7-kube-api-access-kwxxz\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.847670 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32147ec7-e387-4d0b-9536-1341f982729e-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:23 crc kubenswrapper[4841]: I0130 05:24:23.940370 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lbv2q"] Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.187340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn" event={"ID":"47d25b55-9643-45fd-b2fe-eb593334924d","Type":"ContainerStarted","Data":"298c3e61574db7def40be6b74e7df4b9777b7f2eb9efaf52c249f4dc85769ee6"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.188760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df66af1-0c57-44f7-8b54-bc351a3faa66","Type":"ContainerStarted","Data":"cc0f049e0885820ab6e679e1bba1a27774c34c43371b4e35e8197ce8b7b0491e"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.192542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8e2461f9-732a-448f-a5d7-7528bc3956e3","Type":"ContainerStarted","Data":"f380bc1b4527c3685cfad874202c3a7cdb8c53b0054c3b0f3a48d255f8ca2234"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.194903 4841 generic.go:334] "Generic (PLEG): container finished" podID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" containerID="b14a3af1e41ea0c1e07b8525b9292f95aad023593ee39c8c036d470088239bdb" exitCode=0 Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.194939 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" 
event={"ID":"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe","Type":"ContainerDied","Data":"b14a3af1e41ea0c1e07b8525b9292f95aad023593ee39c8c036d470088239bdb"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.196266 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" event={"ID":"beed4aec-0cf6-48ef-b3a1-55843db48bf7","Type":"ContainerDied","Data":"2cd945876f52946045dfe01de944633e0ae16592a207aecaca5b8694045e8102"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.196377 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-j7qw7" Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.199562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad7779ad-0912-4695-853f-3ce786c2e9ae","Type":"ContainerStarted","Data":"8600967c2ade3e891b207c101977cc2ccfbdba76d18ef1953587344914c55439"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.202306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" event={"ID":"89c123b6-19c8-4eb3-b34e-103dcd6cc16e","Type":"ContainerStarted","Data":"a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.202557 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.204072 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lbv2q" event={"ID":"582a9577-0530-4793-8723-01681bdcfda4","Type":"ContainerStarted","Data":"71c2d9813a015d8a631b3ab4999791ef8e0fb3377202fc4db173f9657230e629"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.209270 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"cc423120-ba93-465b-8ef8-871904b901ef","Type":"ContainerStarted","Data":"4aad50bd4b29190901f79507be62ff07001fd77ed292118463a34ecb6a2de1f7"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.217564 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"365caacf-756c-4558-b281-f8644c9c1c5f","Type":"ContainerStarted","Data":"fc053fb5b0d026c63c908afd011d2cb35880c88e1d810dafc464dc552806e00f"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.219944 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f","Type":"ContainerStarted","Data":"692f951e0050bdc3d6c9526d02e8fbdbfe850520ea37731736a809e0bc15b7eb"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.221687 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" event={"ID":"32147ec7-e387-4d0b-9536-1341f982729e","Type":"ContainerDied","Data":"0e99cca3a74382d1c8f3f581d137dd037eccda69638b4d4b282538a9439381ab"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.221831 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5r7fq" Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.226200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"90edf3da-3cbc-407f-9cfa-de97879f3834","Type":"ContainerStarted","Data":"972295e9c0c3b66164e7fbe32b6501eced61aa6288391ea197e6a4619df11e29"} Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.255119 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" podStartSLOduration=2.45101618 podStartE2EDuration="15.255101023s" podCreationTimestamp="2026-01-30 05:24:09 +0000 UTC" firstStartedPulling="2026-01-30 05:24:10.170455677 +0000 UTC m=+987.163928315" lastFinishedPulling="2026-01-30 05:24:22.97454052 +0000 UTC m=+999.968013158" observedRunningTime="2026-01-30 05:24:24.25045006 +0000 UTC m=+1001.243922708" watchObservedRunningTime="2026-01-30 05:24:24.255101023 +0000 UTC m=+1001.248573661" Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.328776 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-j7qw7"] Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.346387 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-j7qw7"] Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.368686 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5r7fq"] Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.374854 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5r7fq"] Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.442535 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32147ec7-e387-4d0b-9536-1341f982729e" path="/var/lib/kubelet/pods/32147ec7-e387-4d0b-9536-1341f982729e/volumes" Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.443007 4841 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="beed4aec-0cf6-48ef-b3a1-55843db48bf7" path="/var/lib/kubelet/pods/beed4aec-0cf6-48ef-b3a1-55843db48bf7/volumes" Jan 30 05:24:24 crc kubenswrapper[4841]: I0130 05:24:24.658148 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:24:24 crc kubenswrapper[4841]: W0130 05:24:24.692826 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd33f000_ac38_400f_95b4_d9f6a68d13c0.slice/crio-0758459fb9e4f3940596789815cd623a5bc6ab5827144d6d8e84cf5952e31bcd WatchSource:0}: Error finding container 0758459fb9e4f3940596789815cd623a5bc6ab5827144d6d8e84cf5952e31bcd: Status 404 returned error can't find the container with id 0758459fb9e4f3940596789815cd623a5bc6ab5827144d6d8e84cf5952e31bcd Jan 30 05:24:25 crc kubenswrapper[4841]: I0130 05:24:25.232729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" event={"ID":"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe","Type":"ContainerStarted","Data":"c21ab3c6538e7669856774046f70544317d893e2307ccf623b6394caff778552"} Jan 30 05:24:25 crc kubenswrapper[4841]: I0130 05:24:25.233718 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:25 crc kubenswrapper[4841]: I0130 05:24:25.234781 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd33f000-ac38-400f-95b4-d9f6a68d13c0","Type":"ContainerStarted","Data":"0758459fb9e4f3940596789815cd623a5bc6ab5827144d6d8e84cf5952e31bcd"} Jan 30 05:24:25 crc kubenswrapper[4841]: I0130 05:24:25.278974 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" podStartSLOduration=-9223372020.575817 podStartE2EDuration="16.27895927s" podCreationTimestamp="2026-01-30 05:24:09 +0000 UTC" firstStartedPulling="2026-01-30 05:24:09.973708919 +0000 
UTC m=+986.967181557" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:24:25.275691685 +0000 UTC m=+1002.269164323" watchObservedRunningTime="2026-01-30 05:24:25.27895927 +0000 UTC m=+1002.272431908" Jan 30 05:24:29 crc kubenswrapper[4841]: I0130 05:24:29.454793 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:29 crc kubenswrapper[4841]: I0130 05:24:29.733624 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:29 crc kubenswrapper[4841]: I0130 05:24:29.781341 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-99rj4"] Jan 30 05:24:30 crc kubenswrapper[4841]: I0130 05:24:30.309213 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" containerName="dnsmasq-dns" containerID="cri-o://c21ab3c6538e7669856774046f70544317d893e2307ccf623b6394caff778552" gracePeriod=10 Jan 30 05:24:30 crc kubenswrapper[4841]: E0130 05:24:30.470684 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc535d999_3ff1_4fc1_8e7e_2e073c9a1fbe.slice/crio-c21ab3c6538e7669856774046f70544317d893e2307ccf623b6394caff778552.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc535d999_3ff1_4fc1_8e7e_2e073c9a1fbe.slice/crio-conmon-c21ab3c6538e7669856774046f70544317d893e2307ccf623b6394caff778552.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.319083 4841 generic.go:334] "Generic (PLEG): container finished" podID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" 
containerID="c21ab3c6538e7669856774046f70544317d893e2307ccf623b6394caff778552" exitCode=0 Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.319163 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" event={"ID":"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe","Type":"ContainerDied","Data":"c21ab3c6538e7669856774046f70544317d893e2307ccf623b6394caff778552"} Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.817142 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.832495 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-dns-svc\") pod \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.832540 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-config\") pod \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.832609 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp69r\" (UniqueName: \"kubernetes.io/projected/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-kube-api-access-qp69r\") pod \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\" (UID: \"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe\") " Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.842968 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-kube-api-access-qp69r" (OuterVolumeSpecName: "kube-api-access-qp69r") pod "c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" (UID: "c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe"). 
InnerVolumeSpecName "kube-api-access-qp69r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.875006 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" (UID: "c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.901716 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-config" (OuterVolumeSpecName: "config") pod "c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" (UID: "c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.934604 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.934643 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:31 crc kubenswrapper[4841]: I0130 05:24:31.934683 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp69r\" (UniqueName: \"kubernetes.io/projected/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe-kube-api-access-qp69r\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:32 crc kubenswrapper[4841]: I0130 05:24:32.334028 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" 
event={"ID":"c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe","Type":"ContainerDied","Data":"d3a69b8c234a0db631e87a5124b38211d38b859cee62fc079314bda00ad1254c"} Jan 30 05:24:32 crc kubenswrapper[4841]: I0130 05:24:32.334108 4841 scope.go:117] "RemoveContainer" containerID="c21ab3c6538e7669856774046f70544317d893e2307ccf623b6394caff778552" Jan 30 05:24:32 crc kubenswrapper[4841]: I0130 05:24:32.334282 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-99rj4" Jan 30 05:24:32 crc kubenswrapper[4841]: I0130 05:24:32.388905 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-99rj4"] Jan 30 05:24:32 crc kubenswrapper[4841]: I0130 05:24:32.398934 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-99rj4"] Jan 30 05:24:32 crc kubenswrapper[4841]: I0130 05:24:32.448939 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" path="/var/lib/kubelet/pods/c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe/volumes" Jan 30 05:24:32 crc kubenswrapper[4841]: I0130 05:24:32.673673 4841 scope.go:117] "RemoveContainer" containerID="b14a3af1e41ea0c1e07b8525b9292f95aad023593ee39c8c036d470088239bdb" Jan 30 05:24:33 crc kubenswrapper[4841]: I0130 05:24:33.346244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"365caacf-756c-4558-b281-f8644c9c1c5f","Type":"ContainerStarted","Data":"e0d76e4beb885ef332ab1fc012958772ae48e732b5830a56d72303260efbbb68"} Jan 30 05:24:33 crc kubenswrapper[4841]: I0130 05:24:33.349983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"90edf3da-3cbc-407f-9cfa-de97879f3834","Type":"ContainerStarted","Data":"33a2da7865353e0526f0692b43cfed86797432813b0f51302a7eafdb6082c4e2"} Jan 30 05:24:33 crc kubenswrapper[4841]: I0130 05:24:33.350195 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/memcached-0" Jan 30 05:24:33 crc kubenswrapper[4841]: I0130 05:24:33.405951 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.175590669 podStartE2EDuration="20.405921694s" podCreationTimestamp="2026-01-30 05:24:13 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.46583534 +0000 UTC m=+1000.459307978" lastFinishedPulling="2026-01-30 05:24:31.696166355 +0000 UTC m=+1008.689639003" observedRunningTime="2026-01-30 05:24:33.398844557 +0000 UTC m=+1010.392317225" watchObservedRunningTime="2026-01-30 05:24:33.405921694 +0000 UTC m=+1010.399394362" Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.363795 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd33f000-ac38-400f-95b4-d9f6a68d13c0","Type":"ContainerStarted","Data":"eaca801d46ea5dc0cda81a019aed95bd7823bddb489326243d7b1a3fbaa6599a"} Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.365819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f","Type":"ContainerStarted","Data":"5bd35b53f796dccea4917358db368bf48a05386eb29d163817739bfea5cb9461"} Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.365929 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.367742 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn" event={"ID":"47d25b55-9643-45fd-b2fe-eb593334924d","Type":"ContainerStarted","Data":"6b3c23c498cb0fcf52b1b3ba92ec5b60803d90067523bd6519592f18536e1fc3"} Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.367880 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-88rdn" Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.370499 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df66af1-0c57-44f7-8b54-bc351a3faa66","Type":"ContainerStarted","Data":"8b17702bd7db482eb4d14d6e36c3a54793a366a5aa87ef51064106768aa101bc"}
Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.373255 4841 generic.go:334] "Generic (PLEG): container finished" podID="582a9577-0530-4793-8723-01681bdcfda4" containerID="f264178a5a1264f97f0566a592589097cee3ffb7247b2cd741139e99d585ed2f" exitCode=0
Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.373285 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lbv2q" event={"ID":"582a9577-0530-4793-8723-01681bdcfda4","Type":"ContainerDied","Data":"f264178a5a1264f97f0566a592589097cee3ffb7247b2cd741139e99d585ed2f"}
Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.375072 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8e2461f9-732a-448f-a5d7-7528bc3956e3","Type":"ContainerStarted","Data":"876a3fddd107a1f4afa02010cf302a5b9e18105c753d854a957c74cfaba89c34"}
Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.376805 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc423120-ba93-465b-8ef8-871904b901ef","Type":"ContainerStarted","Data":"7f1ddbf696f7e72c83c4c0ee8f83a7e88cf635df1e2c2faa46855be05f07c1a9"}
Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.390553 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.227663081 podStartE2EDuration="19.390523738s" podCreationTimestamp="2026-01-30 05:24:15 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.75268391 +0000 UTC m=+1000.746156558" lastFinishedPulling="2026-01-30 05:24:32.915544537 +0000 UTC m=+1009.909017215" observedRunningTime="2026-01-30 05:24:34.382192939 +0000 UTC m=+1011.375665617" watchObservedRunningTime="2026-01-30 05:24:34.390523738 +0000 UTC m=+1011.383996416"
Jan 30 05:24:34 crc kubenswrapper[4841]: I0130 05:24:34.430864 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-88rdn" podStartSLOduration=6.525469428 podStartE2EDuration="15.430847919s" podCreationTimestamp="2026-01-30 05:24:19 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.757959059 +0000 UTC m=+1000.751431697" lastFinishedPulling="2026-01-30 05:24:32.66333755 +0000 UTC m=+1009.656810188" observedRunningTime="2026-01-30 05:24:34.421691578 +0000 UTC m=+1011.415164216" watchObservedRunningTime="2026-01-30 05:24:34.430847919 +0000 UTC m=+1011.424320557"
Jan 30 05:24:35 crc kubenswrapper[4841]: I0130 05:24:35.395257 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lbv2q" event={"ID":"582a9577-0530-4793-8723-01681bdcfda4","Type":"ContainerStarted","Data":"0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15"}
Jan 30 05:24:35 crc kubenswrapper[4841]: I0130 05:24:35.399074 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad7779ad-0912-4695-853f-3ce786c2e9ae","Type":"ContainerStarted","Data":"4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8"}
Jan 30 05:24:36 crc kubenswrapper[4841]: I0130 05:24:36.410365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lbv2q" event={"ID":"582a9577-0530-4793-8723-01681bdcfda4","Type":"ContainerStarted","Data":"6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857"}
Jan 30 05:24:36 crc kubenswrapper[4841]: I0130 05:24:36.410650 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lbv2q"
Jan 30 05:24:36 crc kubenswrapper[4841]: I0130 05:24:36.413589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd33f000-ac38-400f-95b4-d9f6a68d13c0","Type":"ContainerStarted","Data":"d9d5bddafde38d3fd047f16407b7b2377643bde704520227bda10004fa73dc8d"}
Jan 30 05:24:36 crc kubenswrapper[4841]: I0130 05:24:36.417108 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df66af1-0c57-44f7-8b54-bc351a3faa66","Type":"ContainerStarted","Data":"770c37b5591c01b91dbf8c41b92ea066c0501a7dc4909ac010c8ff458d9d3823"}
Jan 30 05:24:36 crc kubenswrapper[4841]: I0130 05:24:36.465011 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-lbv2q" podStartSLOduration=9.495053004 podStartE2EDuration="17.464974965s" podCreationTimestamp="2026-01-30 05:24:19 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.952054397 +0000 UTC m=+1000.945527035" lastFinishedPulling="2026-01-30 05:24:31.921976318 +0000 UTC m=+1008.915448996" observedRunningTime="2026-01-30 05:24:36.45069771 +0000 UTC m=+1013.444170388" watchObservedRunningTime="2026-01-30 05:24:36.464974965 +0000 UTC m=+1013.458447633"
Jan 30 05:24:36 crc kubenswrapper[4841]: I0130 05:24:36.491732 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.125892391 podStartE2EDuration="18.491706449s" podCreationTimestamp="2026-01-30 05:24:18 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.868980241 +0000 UTC m=+1000.862452889" lastFinishedPulling="2026-01-30 05:24:35.234794289 +0000 UTC m=+1012.228266947" observedRunningTime="2026-01-30 05:24:36.488276309 +0000 UTC m=+1013.481748977" watchObservedRunningTime="2026-01-30 05:24:36.491706449 +0000 UTC m=+1013.485179117"
Jan 30 05:24:36 crc kubenswrapper[4841]: I0130 05:24:36.537032 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.988597258 podStartE2EDuration="15.537004552s" podCreationTimestamp="2026-01-30 05:24:21 +0000 UTC" firstStartedPulling="2026-01-30 05:24:24.695777501 +0000 UTC m=+1001.689250129" lastFinishedPulling="2026-01-30 05:24:35.244184785 +0000 UTC m=+1012.237657423" observedRunningTime="2026-01-30 05:24:36.521998947 +0000 UTC m=+1013.515471655" watchObservedRunningTime="2026-01-30 05:24:36.537004552 +0000 UTC m=+1013.530477220"
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.430058 4841 generic.go:334] "Generic (PLEG): container finished" podID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerID="876a3fddd107a1f4afa02010cf302a5b9e18105c753d854a957c74cfaba89c34" exitCode=0
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.430179 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8e2461f9-732a-448f-a5d7-7528bc3956e3","Type":"ContainerDied","Data":"876a3fddd107a1f4afa02010cf302a5b9e18105c753d854a957c74cfaba89c34"}
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.434801 4841 generic.go:334] "Generic (PLEG): container finished" podID="365caacf-756c-4558-b281-f8644c9c1c5f" containerID="e0d76e4beb885ef332ab1fc012958772ae48e732b5830a56d72303260efbbb68" exitCode=0
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.434993 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"365caacf-756c-4558-b281-f8644c9c1c5f","Type":"ContainerDied","Data":"e0d76e4beb885ef332ab1fc012958772ae48e732b5830a56d72303260efbbb68"}
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.436089 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lbv2q"
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.548023 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.548349 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.615744 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.761907 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:37 crc kubenswrapper[4841]: I0130 05:24:37.821586 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.454524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"365caacf-756c-4558-b281-f8644c9c1c5f","Type":"ContainerStarted","Data":"42b9436e6ff50cbea7adadc24b6d483db87ac5a4389559de150639c961eb2f31"}
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.454573 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.454586 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8e2461f9-732a-448f-a5d7-7528bc3956e3","Type":"ContainerStarted","Data":"11b34a44cee08eb57f7dd12e3101d19de35b79f567abf07b3e86b4e5ced1412b"}
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.513245 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.106886048 podStartE2EDuration="28.513213643s" podCreationTimestamp="2026-01-30 05:24:10 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.778763487 +0000 UTC m=+1000.772236125" lastFinishedPulling="2026-01-30 05:24:32.185091052 +0000 UTC m=+1009.178563720" observedRunningTime="2026-01-30 05:24:38.4891694 +0000 UTC m=+1015.482642078" watchObservedRunningTime="2026-01-30 05:24:38.513213643 +0000 UTC m=+1015.506686321"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.528077 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.554894978 podStartE2EDuration="26.528055844s" podCreationTimestamp="2026-01-30 05:24:12 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.753637615 +0000 UTC m=+1000.747110253" lastFinishedPulling="2026-01-30 05:24:31.726798461 +0000 UTC m=+1008.720271119" observedRunningTime="2026-01-30 05:24:38.527958411 +0000 UTC m=+1015.521431079" watchObservedRunningTime="2026-01-30 05:24:38.528055844 +0000 UTC m=+1015.521528492"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.531883 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.543120 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.797195 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7878659675-2q6fg"]
Jan 30 05:24:38 crc kubenswrapper[4841]: E0130 05:24:38.797635 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" containerName="dnsmasq-dns"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.797681 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" containerName="dnsmasq-dns"
Jan 30 05:24:38 crc kubenswrapper[4841]: E0130 05:24:38.797700 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" containerName="init"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.797709 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" containerName="init"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.798046 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c535d999-3ff1-4fc1-8e7e-2e073c9a1fbe" containerName="dnsmasq-dns"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.799034 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.811305 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.822259 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7878659675-2q6fg"]
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.879947 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvtgm\" (UniqueName: \"kubernetes.io/projected/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-kube-api-access-nvtgm\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.880213 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-dns-svc\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.880313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-config\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.880514 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-ovsdbserver-nb\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.982495 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-ovsdbserver-nb\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.982619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvtgm\" (UniqueName: \"kubernetes.io/projected/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-kube-api-access-nvtgm\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.982656 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-dns-svc\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.982708 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-config\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.983558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-ovsdbserver-nb\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.983717 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-dns-svc\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:38 crc kubenswrapper[4841]: I0130 05:24:38.983790 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-config\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.013230 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvtgm\" (UniqueName: \"kubernetes.io/projected/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-kube-api-access-nvtgm\") pod \"dnsmasq-dns-7878659675-2q6fg\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.026346 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-n25tn"]
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.036078 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.039328 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.041552 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n25tn"]
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.044264 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.084767 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovn-rundir\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.084818 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovs-rundir\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.084918 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c47888-1780-4539-8777-5914009b862f-config\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.084948 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-combined-ca-bundle\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.084963 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.084977 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf5gr\" (UniqueName: \"kubernetes.io/projected/30c47888-1780-4539-8777-5914009b862f-kube-api-access-vf5gr\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.101066 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7878659675-2q6fg"]
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.102288 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-2q6fg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.112751 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.113939 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.116794 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.116937 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.117039 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5fvm7"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.117143 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.128268 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.136393 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-8lrwg"]
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.137628 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.141004 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.165856 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-8lrwg"]
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186532 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c47888-1780-4539-8777-5914009b862f-config\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186613 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-dns-svc\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-combined-ca-bundle\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186660 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186677 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf5gr\" (UniqueName: \"kubernetes.io/projected/30c47888-1780-4539-8777-5914009b862f-kube-api-access-vf5gr\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186698 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186727 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovn-rundir\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186747 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skk6j\" (UniqueName: \"kubernetes.io/projected/64779ffd-87e4-4348-b342-89e104dfb706-kube-api-access-skk6j\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186772 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovs-rundir\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-config\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186825 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vlbb\" (UniqueName: \"kubernetes.io/projected/84924340-1dd2-488e-a6e6-adfe62b61f2f-kube-api-access-9vlbb\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186853 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186871 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186887 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-scripts\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186904 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186920 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.186943 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-config\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.187639 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c47888-1780-4539-8777-5914009b862f-config\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.191695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovn-rundir\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.192023 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovs-rundir\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.193586 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-combined-ca-bundle\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.194811 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.221674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf5gr\" (UniqueName: \"kubernetes.io/projected/30c47888-1780-4539-8777-5914009b862f-kube-api-access-vf5gr\") pod \"ovn-controller-metrics-n25tn\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " pod="openstack/ovn-controller-metrics-n25tn"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289230 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-config\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vlbb\" (UniqueName: \"kubernetes.io/projected/84924340-1dd2-488e-a6e6-adfe62b61f2f-kube-api-access-9vlbb\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289329 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289349 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289367 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-scripts\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289389 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289447 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-config\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-dns-svc\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289573 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.289616 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skk6j\" (UniqueName: \"kubernetes.io/projected/64779ffd-87e4-4348-b342-89e104dfb706-kube-api-access-skk6j\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.290753 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-config\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.302178 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-scripts\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.302490 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.303054 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.307179 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-dns-svc\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.307851 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-config\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.308484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.320175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.320416 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.321161 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0"
Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.321459 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skk6j\" (UniqueName:
\"kubernetes.io/projected/64779ffd-87e4-4348-b342-89e104dfb706-kube-api-access-skk6j\") pod \"dnsmasq-dns-586b989cdc-8lrwg\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.326869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vlbb\" (UniqueName: \"kubernetes.io/projected/84924340-1dd2-488e-a6e6-adfe62b61f2f-kube-api-access-9vlbb\") pod \"ovn-northd-0\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " pod="openstack/ovn-northd-0" Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.364578 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-n25tn" Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.566971 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.574799 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.697847 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7878659675-2q6fg"] Jan 30 05:24:39 crc kubenswrapper[4841]: I0130 05:24:39.958238 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n25tn"] Jan 30 05:24:40 crc kubenswrapper[4841]: I0130 05:24:40.363484 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 05:24:40 crc kubenswrapper[4841]: W0130 05:24:40.369143 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64779ffd_87e4_4348_b342_89e104dfb706.slice/crio-f0d1fa9b1dc1376c372dbdbcaa69098bf9214a50c8b1214328c5a63dc2dc13e1 WatchSource:0}: Error finding container f0d1fa9b1dc1376c372dbdbcaa69098bf9214a50c8b1214328c5a63dc2dc13e1: Status 404 returned error can't find the container with id f0d1fa9b1dc1376c372dbdbcaa69098bf9214a50c8b1214328c5a63dc2dc13e1 Jan 30 05:24:40 crc kubenswrapper[4841]: I0130 05:24:40.371651 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-8lrwg"] Jan 30 05:24:40 crc kubenswrapper[4841]: I0130 05:24:40.466547 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n25tn" event={"ID":"30c47888-1780-4539-8777-5914009b862f","Type":"ContainerStarted","Data":"91d787781b4d6ae54f4e8271b150a24f444dd086faa4daa6e699af2344ce85e6"} Jan 30 05:24:40 crc kubenswrapper[4841]: I0130 05:24:40.468793 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-2q6fg" event={"ID":"a878694e-5ff5-4bcf-99a9-e06bda1b87cb","Type":"ContainerStarted","Data":"9e6e81331f003061dbb361196c2d973ca992d3722fbf7d96b241d8be52a06436"} Jan 30 05:24:40 crc kubenswrapper[4841]: I0130 05:24:40.471575 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" event={"ID":"64779ffd-87e4-4348-b342-89e104dfb706","Type":"ContainerStarted","Data":"f0d1fa9b1dc1376c372dbdbcaa69098bf9214a50c8b1214328c5a63dc2dc13e1"} Jan 30 05:24:40 crc kubenswrapper[4841]: I0130 05:24:40.472861 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"84924340-1dd2-488e-a6e6-adfe62b61f2f","Type":"ContainerStarted","Data":"5d2d793ab0fb7ca6219d5551b99e77832cc435ad9e83e1edcc674406dd3badf9"} Jan 30 05:24:42 crc kubenswrapper[4841]: I0130 05:24:42.173847 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 30 05:24:42 crc kubenswrapper[4841]: I0130 05:24:42.174238 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 30 05:24:43 crc kubenswrapper[4841]: I0130 05:24:43.596891 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:43 crc kubenswrapper[4841]: I0130 05:24:43.597317 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.517331 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" event={"ID":"64779ffd-87e4-4348-b342-89e104dfb706","Type":"ContainerStarted","Data":"6e17541031973c346ad1ade10644d1fe5e2230acbe3ad578e98b100535c76b79"} Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.714775 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-8lrwg"] Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.751089 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-6vbsx"] Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.752295 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.753449 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.770046 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-6vbsx"] Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.784127 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.897082 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.925574 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhghd\" (UniqueName: \"kubernetes.io/projected/1b9dbcfe-b92e-4c53-a410-86416ae62413-kube-api-access-jhghd\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.925823 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-config\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.925890 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:45 crc 
kubenswrapper[4841]: I0130 05:24:45.925920 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:45 crc kubenswrapper[4841]: I0130 05:24:45.926097 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.027280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-config\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.027325 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.027366 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.027443 
4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.027472 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhghd\" (UniqueName: \"kubernetes.io/projected/1b9dbcfe-b92e-4c53-a410-86416ae62413-kube-api-access-jhghd\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.028803 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-config\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.028887 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.029177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.029203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.051440 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhghd\" (UniqueName: \"kubernetes.io/projected/1b9dbcfe-b92e-4c53-a410-86416ae62413-kube-api-access-jhghd\") pod \"dnsmasq-dns-67fdf7998c-6vbsx\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") " pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.091780 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.524170 4841 generic.go:334] "Generic (PLEG): container finished" podID="a878694e-5ff5-4bcf-99a9-e06bda1b87cb" containerID="58d536eb1d3e56bcfeccccac33f655dab62dce090bfa4330fea4c507d2adc6f3" exitCode=0 Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.524380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-2q6fg" event={"ID":"a878694e-5ff5-4bcf-99a9-e06bda1b87cb","Type":"ContainerDied","Data":"58d536eb1d3e56bcfeccccac33f655dab62dce090bfa4330fea4c507d2adc6f3"} Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.527431 4841 generic.go:334] "Generic (PLEG): container finished" podID="64779ffd-87e4-4348-b342-89e104dfb706" containerID="6e17541031973c346ad1ade10644d1fe5e2230acbe3ad578e98b100535c76b79" exitCode=0 Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.527735 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" event={"ID":"64779ffd-87e4-4348-b342-89e104dfb706","Type":"ContainerDied","Data":"6e17541031973c346ad1ade10644d1fe5e2230acbe3ad578e98b100535c76b79"} Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 
05:24:46.531206 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n25tn" event={"ID":"30c47888-1780-4539-8777-5914009b862f","Type":"ContainerStarted","Data":"97894b49cfe91943a3d82f73c54427c16ccf3ca0fab823dd8f5f0d111192e569"} Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.571953 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-n25tn" podStartSLOduration=7.571933222 podStartE2EDuration="7.571933222s" podCreationTimestamp="2026-01-30 05:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:24:46.564582248 +0000 UTC m=+1023.558054886" watchObservedRunningTime="2026-01-30 05:24:46.571933222 +0000 UTC m=+1023.565405880" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.766883 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-6vbsx"] Jan 30 05:24:46 crc kubenswrapper[4841]: E0130 05:24:46.769317 4841 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 30 05:24:46 crc kubenswrapper[4841]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/64779ffd-87e4-4348-b342-89e104dfb706/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 05:24:46 crc kubenswrapper[4841]: > podSandboxID="f0d1fa9b1dc1376c372dbdbcaa69098bf9214a50c8b1214328c5a63dc2dc13e1" Jan 30 05:24:46 crc kubenswrapper[4841]: E0130 05:24:46.769535 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:24:46 crc kubenswrapper[4841]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skk6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-586b989cdc-8lrwg_openstack(64779ffd-87e4-4348-b342-89e104dfb706): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/64779ffd-87e4-4348-b342-89e104dfb706/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 05:24:46 crc kubenswrapper[4841]: > logger="UnhandledError" Jan 30 05:24:46 crc kubenswrapper[4841]: E0130 05:24:46.771016 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/64779ffd-87e4-4348-b342-89e104dfb706/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" podUID="64779ffd-87e4-4348-b342-89e104dfb706" Jan 30 05:24:46 crc kubenswrapper[4841]: W0130 05:24:46.803022 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b9dbcfe_b92e_4c53_a410_86416ae62413.slice/crio-db9ec6f522ae76077ce78ba3dec5cb60f5313fb37859a09b8eb3e085871db08d WatchSource:0}: Error 
finding container db9ec6f522ae76077ce78ba3dec5cb60f5313fb37859a09b8eb3e085871db08d: Status 404 returned error can't find the container with id db9ec6f522ae76077ce78ba3dec5cb60f5313fb37859a09b8eb3e085871db08d Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.844106 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-2q6fg" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.901953 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:24:46 crc kubenswrapper[4841]: E0130 05:24:46.902274 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a878694e-5ff5-4bcf-99a9-e06bda1b87cb" containerName="init" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.902292 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a878694e-5ff5-4bcf-99a9-e06bda1b87cb" containerName="init" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.902471 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a878694e-5ff5-4bcf-99a9-e06bda1b87cb" containerName="init" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.906590 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.908858 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-vw9cm" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.909048 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.909186 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.909355 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.930213 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.940851 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-ovsdbserver-nb\") pod \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.940918 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvtgm\" (UniqueName: \"kubernetes.io/projected/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-kube-api-access-nvtgm\") pod \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.940985 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-config\") pod \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.941069 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-dns-svc\") pod \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\" (UID: \"a878694e-5ff5-4bcf-99a9-e06bda1b87cb\") " Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.941357 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2jw7\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-kube-api-access-g2jw7\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.941389 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.941434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-cache\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.941451 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b70c69eb-7b62-446a-8748-9a80d6fbe28b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.941469 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-lock\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.941509 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.945292 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-kube-api-access-nvtgm" (OuterVolumeSpecName: "kube-api-access-nvtgm") pod "a878694e-5ff5-4bcf-99a9-e06bda1b87cb" (UID: "a878694e-5ff5-4bcf-99a9-e06bda1b87cb"). InnerVolumeSpecName "kube-api-access-nvtgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.963184 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a878694e-5ff5-4bcf-99a9-e06bda1b87cb" (UID: "a878694e-5ff5-4bcf-99a9-e06bda1b87cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.969801 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-config" (OuterVolumeSpecName: "config") pod "a878694e-5ff5-4bcf-99a9-e06bda1b87cb" (UID: "a878694e-5ff5-4bcf-99a9-e06bda1b87cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:46 crc kubenswrapper[4841]: I0130 05:24:46.973621 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a878694e-5ff5-4bcf-99a9-e06bda1b87cb" (UID: "a878694e-5ff5-4bcf-99a9-e06bda1b87cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2jw7\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-kube-api-access-g2jw7\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042511 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042545 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-cache\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b70c69eb-7b62-446a-8748-9a80d6fbe28b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042574 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-lock\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042665 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042677 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvtgm\" (UniqueName: \"kubernetes.io/projected/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-kube-api-access-nvtgm\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042686 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.042694 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a878694e-5ff5-4bcf-99a9-e06bda1b87cb-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.043276 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-cache\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " 
pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: E0130 05:24:47.043602 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:24:47 crc kubenswrapper[4841]: E0130 05:24:47.043634 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:24:47 crc kubenswrapper[4841]: E0130 05:24:47.043742 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift podName:b70c69eb-7b62-446a-8748-9a80d6fbe28b nodeName:}" failed. No retries permitted until 2026-01-30 05:24:47.543721199 +0000 UTC m=+1024.537193847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift") pod "swift-storage-0" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b") : configmap "swift-ring-files" not found Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.043751 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.043735 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-lock\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.048335 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b70c69eb-7b62-446a-8748-9a80d6fbe28b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.060298 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2jw7\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-kube-api-access-g2jw7\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.069111 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.543633 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-2q6fg" event={"ID":"a878694e-5ff5-4bcf-99a9-e06bda1b87cb","Type":"ContainerDied","Data":"9e6e81331f003061dbb361196c2d973ca992d3722fbf7d96b241d8be52a06436"} Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.543733 4841 scope.go:117] "RemoveContainer" containerID="58d536eb1d3e56bcfeccccac33f655dab62dce090bfa4330fea4c507d2adc6f3" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.543662 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-2q6fg" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.549394 4841 generic.go:334] "Generic (PLEG): container finished" podID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerID="042b8d579a22310a2de0e98f37bdc1bd2e5e2047dcadb4df5582af91e9a92bae" exitCode=0 Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.549530 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" event={"ID":"1b9dbcfe-b92e-4c53-a410-86416ae62413","Type":"ContainerDied","Data":"042b8d579a22310a2de0e98f37bdc1bd2e5e2047dcadb4df5582af91e9a92bae"} Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.549570 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" event={"ID":"1b9dbcfe-b92e-4c53-a410-86416ae62413","Type":"ContainerStarted","Data":"db9ec6f522ae76077ce78ba3dec5cb60f5313fb37859a09b8eb3e085871db08d"} Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.556142 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.556710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"84924340-1dd2-488e-a6e6-adfe62b61f2f","Type":"ContainerStarted","Data":"465a65be37ff3c79b8c121932b87f14d4319db96cad76bedc4f91b6b55b13cf0"} Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.556763 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"84924340-1dd2-488e-a6e6-adfe62b61f2f","Type":"ContainerStarted","Data":"b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05"} Jan 30 05:24:47 crc kubenswrapper[4841]: E0130 05:24:47.556845 4841 projected.go:288] Couldn't 
get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:24:47 crc kubenswrapper[4841]: E0130 05:24:47.556871 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:24:47 crc kubenswrapper[4841]: E0130 05:24:47.556938 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift podName:b70c69eb-7b62-446a-8748-9a80d6fbe28b nodeName:}" failed. No retries permitted until 2026-01-30 05:24:48.556915676 +0000 UTC m=+1025.550388344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift") pod "swift-storage-0" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b") : configmap "swift-ring-files" not found Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.636667 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.663405254 podStartE2EDuration="8.636622814s" podCreationTimestamp="2026-01-30 05:24:39 +0000 UTC" firstStartedPulling="2026-01-30 05:24:40.385228173 +0000 UTC m=+1017.378700821" lastFinishedPulling="2026-01-30 05:24:46.358445743 +0000 UTC m=+1023.351918381" observedRunningTime="2026-01-30 05:24:47.632658799 +0000 UTC m=+1024.626131447" watchObservedRunningTime="2026-01-30 05:24:47.636622814 +0000 UTC m=+1024.630095462" Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.721844 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7878659675-2q6fg"] Jan 30 05:24:47 crc kubenswrapper[4841]: I0130 05:24:47.734035 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7878659675-2q6fg"] Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.128026 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.166174 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skk6j\" (UniqueName: \"kubernetes.io/projected/64779ffd-87e4-4348-b342-89e104dfb706-kube-api-access-skk6j\") pod \"64779ffd-87e4-4348-b342-89e104dfb706\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.166244 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-dns-svc\") pod \"64779ffd-87e4-4348-b342-89e104dfb706\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.166267 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-sb\") pod \"64779ffd-87e4-4348-b342-89e104dfb706\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.166321 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-nb\") pod \"64779ffd-87e4-4348-b342-89e104dfb706\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.166363 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-config\") pod \"64779ffd-87e4-4348-b342-89e104dfb706\" (UID: \"64779ffd-87e4-4348-b342-89e104dfb706\") " Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.181212 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/64779ffd-87e4-4348-b342-89e104dfb706-kube-api-access-skk6j" (OuterVolumeSpecName: "kube-api-access-skk6j") pod "64779ffd-87e4-4348-b342-89e104dfb706" (UID: "64779ffd-87e4-4348-b342-89e104dfb706"). InnerVolumeSpecName "kube-api-access-skk6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.197628 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-config" (OuterVolumeSpecName: "config") pod "64779ffd-87e4-4348-b342-89e104dfb706" (UID: "64779ffd-87e4-4348-b342-89e104dfb706"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.203164 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64779ffd-87e4-4348-b342-89e104dfb706" (UID: "64779ffd-87e4-4348-b342-89e104dfb706"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.207124 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64779ffd-87e4-4348-b342-89e104dfb706" (UID: "64779ffd-87e4-4348-b342-89e104dfb706"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.215550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64779ffd-87e4-4348-b342-89e104dfb706" (UID: "64779ffd-87e4-4348-b342-89e104dfb706"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.268191 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skk6j\" (UniqueName: \"kubernetes.io/projected/64779ffd-87e4-4348-b342-89e104dfb706-kube-api-access-skk6j\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.268240 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.268259 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.268276 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.268294 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64779ffd-87e4-4348-b342-89e104dfb706-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.445731 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a878694e-5ff5-4bcf-99a9-e06bda1b87cb" path="/var/lib/kubelet/pods/a878694e-5ff5-4bcf-99a9-e06bda1b87cb/volumes" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.569911 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" event={"ID":"1b9dbcfe-b92e-4c53-a410-86416ae62413","Type":"ContainerStarted","Data":"6e40304e7bb30193215e7c6aa1b64cb5e94495d6e9c2a8ef9b6c2738d50d1707"} Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 
05:24:48.570382 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.574334 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:48 crc kubenswrapper[4841]: E0130 05:24:48.574591 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:24:48 crc kubenswrapper[4841]: E0130 05:24:48.574631 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:24:48 crc kubenswrapper[4841]: E0130 05:24:48.574717 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift podName:b70c69eb-7b62-446a-8748-9a80d6fbe28b nodeName:}" failed. No retries permitted until 2026-01-30 05:24:50.574689233 +0000 UTC m=+1027.568161911 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift") pod "swift-storage-0" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b") : configmap "swift-ring-files" not found Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.574812 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" event={"ID":"64779ffd-87e4-4348-b342-89e104dfb706","Type":"ContainerDied","Data":"f0d1fa9b1dc1376c372dbdbcaa69098bf9214a50c8b1214328c5a63dc2dc13e1"} Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.574862 4841 scope.go:117] "RemoveContainer" containerID="6e17541031973c346ad1ade10644d1fe5e2230acbe3ad578e98b100535c76b79" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.574902 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-8lrwg" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.575058 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.596643 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" podStartSLOduration=3.59662866 podStartE2EDuration="3.59662866s" podCreationTimestamp="2026-01-30 05:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:24:48.59474581 +0000 UTC m=+1025.588218438" watchObservedRunningTime="2026-01-30 05:24:48.59662866 +0000 UTC m=+1025.590101298" Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.657601 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-8lrwg"] Jan 30 05:24:48 crc kubenswrapper[4841]: I0130 05:24:48.687743 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-8lrwg"] Jan 30 
05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.416211 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9a61-account-create-update-hb6tt"] Jan 30 05:24:49 crc kubenswrapper[4841]: E0130 05:24:49.417248 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64779ffd-87e4-4348-b342-89e104dfb706" containerName="init" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.417424 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="64779ffd-87e4-4348-b342-89e104dfb706" containerName="init" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.417883 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="64779ffd-87e4-4348-b342-89e104dfb706" containerName="init" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.418913 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.422171 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.430108 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9a61-account-create-update-hb6tt"] Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.465024 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-w85md"] Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.466563 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.474213 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-w85md"] Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.493239 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x44pq\" (UniqueName: \"kubernetes.io/projected/66d4a4be-d1db-4e1e-81de-09de7396cb0a-kube-api-access-x44pq\") pod \"glance-9a61-account-create-update-hb6tt\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.493482 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d4a4be-d1db-4e1e-81de-09de7396cb0a-operator-scripts\") pod \"glance-9a61-account-create-update-hb6tt\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.493654 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvnt\" (UniqueName: \"kubernetes.io/projected/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-kube-api-access-zhvnt\") pod \"glance-db-create-w85md\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.493790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-operator-scripts\") pod \"glance-db-create-w85md\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.595169 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d4a4be-d1db-4e1e-81de-09de7396cb0a-operator-scripts\") pod \"glance-9a61-account-create-update-hb6tt\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.595286 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvnt\" (UniqueName: \"kubernetes.io/projected/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-kube-api-access-zhvnt\") pod \"glance-db-create-w85md\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.595327 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-operator-scripts\") pod \"glance-db-create-w85md\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.595384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x44pq\" (UniqueName: \"kubernetes.io/projected/66d4a4be-d1db-4e1e-81de-09de7396cb0a-kube-api-access-x44pq\") pod \"glance-9a61-account-create-update-hb6tt\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.596589 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d4a4be-d1db-4e1e-81de-09de7396cb0a-operator-scripts\") pod \"glance-9a61-account-create-update-hb6tt\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 
05:24:49.597422 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-operator-scripts\") pod \"glance-db-create-w85md\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.614928 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x44pq\" (UniqueName: \"kubernetes.io/projected/66d4a4be-d1db-4e1e-81de-09de7396cb0a-kube-api-access-x44pq\") pod \"glance-9a61-account-create-update-hb6tt\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.617445 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvnt\" (UniqueName: \"kubernetes.io/projected/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-kube-api-access-zhvnt\") pod \"glance-db-create-w85md\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.742034 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.749774 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.789069 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-w85md" Jan 30 05:24:49 crc kubenswrapper[4841]: I0130 05:24:49.881943 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.244834 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9a61-account-create-update-hb6tt"] Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.403048 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-w85md"] Jan 30 05:24:50 crc kubenswrapper[4841]: W0130 05:24:50.408576 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa02d9bc_89ad_4e58_aa4b_62455308f9e7.slice/crio-33643660ad6234b35a632026e43d2bee6bf43514f10ad0371c57f5269c491e79 WatchSource:0}: Error finding container 33643660ad6234b35a632026e43d2bee6bf43514f10ad0371c57f5269c491e79: Status 404 returned error can't find the container with id 33643660ad6234b35a632026e43d2bee6bf43514f10ad0371c57f5269c491e79 Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.448295 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64779ffd-87e4-4348-b342-89e104dfb706" path="/var/lib/kubelet/pods/64779ffd-87e4-4348-b342-89e104dfb706/volumes" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.597203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9a61-account-create-update-hb6tt" event={"ID":"66d4a4be-d1db-4e1e-81de-09de7396cb0a","Type":"ContainerStarted","Data":"116957a31ecc7810eee65d27433abb88c8fec810845e23d0583e85d00bf7ce89"} Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.597262 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9a61-account-create-update-hb6tt" 
event={"ID":"66d4a4be-d1db-4e1e-81de-09de7396cb0a","Type":"ContainerStarted","Data":"295accf07068997f727e877a08320a0c22f67ecb92887f63e4a0eff555b7a2b1"} Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.602173 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w85md" event={"ID":"fa02d9bc-89ad-4e58-aa4b-62455308f9e7","Type":"ContainerStarted","Data":"33643660ad6234b35a632026e43d2bee6bf43514f10ad0371c57f5269c491e79"} Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.612280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:50 crc kubenswrapper[4841]: E0130 05:24:50.612520 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:24:50 crc kubenswrapper[4841]: E0130 05:24:50.612564 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:24:50 crc kubenswrapper[4841]: E0130 05:24:50.612647 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift podName:b70c69eb-7b62-446a-8748-9a80d6fbe28b nodeName:}" failed. No retries permitted until 2026-01-30 05:24:54.6126149 +0000 UTC m=+1031.606087578 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift") pod "swift-storage-0" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b") : configmap "swift-ring-files" not found Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.625426 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9a61-account-create-update-hb6tt" podStartSLOduration=1.625379755 podStartE2EDuration="1.625379755s" podCreationTimestamp="2026-01-30 05:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:24:50.617852117 +0000 UTC m=+1027.611324795" watchObservedRunningTime="2026-01-30 05:24:50.625379755 +0000 UTC m=+1027.618852433" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.851987 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9t6gr"] Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.853356 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.856255 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.865557 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jkv7n"] Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.866808 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.869860 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.870067 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.870176 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.872913 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9t6gr"] Jan 30 05:24:50 crc kubenswrapper[4841]: I0130 05:24:50.878507 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jkv7n"] Jan 30 05:24:50 crc kubenswrapper[4841]: E0130 05:24:50.995603 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa02d9bc_89ad_4e58_aa4b_62455308f9e7.slice/crio-conmon-91f9480848574a237405ae1da5561a7e4158f6f071fd8b1259fa2288ebb97e00.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.024992 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvmt\" (UniqueName: \"kubernetes.io/projected/ff309626-60f6-4110-8b20-5354dab1ca68-kube-api-access-mvvmt\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025033 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbz9c\" (UniqueName: 
\"kubernetes.io/projected/455dfc60-4744-463e-88b9-b58de1aa5e4f-kube-api-access-mbz9c\") pod \"root-account-create-update-9t6gr\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") " pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025067 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-ring-data-devices\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025084 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-scripts\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025113 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-swiftconf\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025135 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-dispersionconf\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025160 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/455dfc60-4744-463e-88b9-b58de1aa5e4f-operator-scripts\") pod \"root-account-create-update-9t6gr\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") " pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025200 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-combined-ca-bundle\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.025237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff309626-60f6-4110-8b20-5354dab1ca68-etc-swift\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126447 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-ring-data-devices\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126488 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-scripts\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126519 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-swiftconf\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126548 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-dispersionconf\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/455dfc60-4744-463e-88b9-b58de1aa5e4f-operator-scripts\") pod \"root-account-create-update-9t6gr\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") " pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126615 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-combined-ca-bundle\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126655 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff309626-60f6-4110-8b20-5354dab1ca68-etc-swift\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbz9c\" (UniqueName: 
\"kubernetes.io/projected/455dfc60-4744-463e-88b9-b58de1aa5e4f-kube-api-access-mbz9c\") pod \"root-account-create-update-9t6gr\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") " pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.126713 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvmt\" (UniqueName: \"kubernetes.io/projected/ff309626-60f6-4110-8b20-5354dab1ca68-kube-api-access-mvvmt\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.128147 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-ring-data-devices\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.128411 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff309626-60f6-4110-8b20-5354dab1ca68-etc-swift\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.128582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-scripts\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.129055 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/455dfc60-4744-463e-88b9-b58de1aa5e4f-operator-scripts\") pod 
\"root-account-create-update-9t6gr\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") " pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.132322 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-swiftconf\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.135542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-dispersionconf\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.136687 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-combined-ca-bundle\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.154231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvmt\" (UniqueName: \"kubernetes.io/projected/ff309626-60f6-4110-8b20-5354dab1ca68-kube-api-access-mvvmt\") pod \"swift-ring-rebalance-jkv7n\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") " pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.155961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbz9c\" (UniqueName: \"kubernetes.io/projected/455dfc60-4744-463e-88b9-b58de1aa5e4f-kube-api-access-mbz9c\") pod \"root-account-create-update-9t6gr\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") 
" pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.167815 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.182175 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jkv7n" Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.611155 4841 generic.go:334] "Generic (PLEG): container finished" podID="66d4a4be-d1db-4e1e-81de-09de7396cb0a" containerID="116957a31ecc7810eee65d27433abb88c8fec810845e23d0583e85d00bf7ce89" exitCode=0 Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.611384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9a61-account-create-update-hb6tt" event={"ID":"66d4a4be-d1db-4e1e-81de-09de7396cb0a","Type":"ContainerDied","Data":"116957a31ecc7810eee65d27433abb88c8fec810845e23d0583e85d00bf7ce89"} Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.615327 4841 generic.go:334] "Generic (PLEG): container finished" podID="fa02d9bc-89ad-4e58-aa4b-62455308f9e7" containerID="91f9480848574a237405ae1da5561a7e4158f6f071fd8b1259fa2288ebb97e00" exitCode=0 Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.615392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w85md" event={"ID":"fa02d9bc-89ad-4e58-aa4b-62455308f9e7","Type":"ContainerDied","Data":"91f9480848574a237405ae1da5561a7e4158f6f071fd8b1259fa2288ebb97e00"} Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.696353 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jkv7n"] Jan 30 05:24:51 crc kubenswrapper[4841]: W0130 05:24:51.700217 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff309626_60f6_4110_8b20_5354dab1ca68.slice/crio-382de7a4b251ae28aa4add4ffb25c244e79f5de637cfc79ac31a53eada41f353 WatchSource:0}: Error finding container 382de7a4b251ae28aa4add4ffb25c244e79f5de637cfc79ac31a53eada41f353: Status 404 returned error can't find the container with id 382de7a4b251ae28aa4add4ffb25c244e79f5de637cfc79ac31a53eada41f353 Jan 30 05:24:51 crc kubenswrapper[4841]: W0130 05:24:51.747189 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455dfc60_4744_463e_88b9_b58de1aa5e4f.slice/crio-fcb379700371fa309b64d17f10ace416365d7e21b863c1e09580d804c2eecbe3 WatchSource:0}: Error finding container fcb379700371fa309b64d17f10ace416365d7e21b863c1e09580d804c2eecbe3: Status 404 returned error can't find the container with id fcb379700371fa309b64d17f10ace416365d7e21b863c1e09580d804c2eecbe3 Jan 30 05:24:51 crc kubenswrapper[4841]: I0130 05:24:51.748490 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9t6gr"] Jan 30 05:24:52 crc kubenswrapper[4841]: I0130 05:24:52.626439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkv7n" event={"ID":"ff309626-60f6-4110-8b20-5354dab1ca68","Type":"ContainerStarted","Data":"382de7a4b251ae28aa4add4ffb25c244e79f5de637cfc79ac31a53eada41f353"} Jan 30 05:24:52 crc kubenswrapper[4841]: I0130 05:24:52.635253 4841 generic.go:334] "Generic (PLEG): container finished" podID="455dfc60-4744-463e-88b9-b58de1aa5e4f" containerID="3848f383886419071be51c8b532717e09cd0c6ae367b14c933b86749a45d48de" exitCode=0 Jan 30 05:24:52 crc kubenswrapper[4841]: I0130 05:24:52.635423 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9t6gr" 
event={"ID":"455dfc60-4744-463e-88b9-b58de1aa5e4f","Type":"ContainerDied","Data":"3848f383886419071be51c8b532717e09cd0c6ae367b14c933b86749a45d48de"} Jan 30 05:24:52 crc kubenswrapper[4841]: I0130 05:24:52.635495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9t6gr" event={"ID":"455dfc60-4744-463e-88b9-b58de1aa5e4f","Type":"ContainerStarted","Data":"fcb379700371fa309b64d17f10ace416365d7e21b863c1e09580d804c2eecbe3"} Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.065581 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.160692 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x44pq\" (UniqueName: \"kubernetes.io/projected/66d4a4be-d1db-4e1e-81de-09de7396cb0a-kube-api-access-x44pq\") pod \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.161060 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d4a4be-d1db-4e1e-81de-09de7396cb0a-operator-scripts\") pod \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\" (UID: \"66d4a4be-d1db-4e1e-81de-09de7396cb0a\") " Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.162164 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d4a4be-d1db-4e1e-81de-09de7396cb0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66d4a4be-d1db-4e1e-81de-09de7396cb0a" (UID: "66d4a4be-d1db-4e1e-81de-09de7396cb0a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.167437 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d4a4be-d1db-4e1e-81de-09de7396cb0a-kube-api-access-x44pq" (OuterVolumeSpecName: "kube-api-access-x44pq") pod "66d4a4be-d1db-4e1e-81de-09de7396cb0a" (UID: "66d4a4be-d1db-4e1e-81de-09de7396cb0a"). InnerVolumeSpecName "kube-api-access-x44pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.263004 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d4a4be-d1db-4e1e-81de-09de7396cb0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.263049 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x44pq\" (UniqueName: \"kubernetes.io/projected/66d4a4be-d1db-4e1e-81de-09de7396cb0a-kube-api-access-x44pq\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.624299 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2knp8"] Jan 30 05:24:53 crc kubenswrapper[4841]: E0130 05:24:53.624813 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d4a4be-d1db-4e1e-81de-09de7396cb0a" containerName="mariadb-account-create-update" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.624837 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d4a4be-d1db-4e1e-81de-09de7396cb0a" containerName="mariadb-account-create-update" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.625080 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d4a4be-d1db-4e1e-81de-09de7396cb0a" containerName="mariadb-account-create-update" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.625772 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.631702 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2knp8"] Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.647022 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9a61-account-create-update-hb6tt" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.646991 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9a61-account-create-update-hb6tt" event={"ID":"66d4a4be-d1db-4e1e-81de-09de7396cb0a","Type":"ContainerDied","Data":"295accf07068997f727e877a08320a0c22f67ecb92887f63e4a0eff555b7a2b1"} Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.647161 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="295accf07068997f727e877a08320a0c22f67ecb92887f63e4a0eff555b7a2b1" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.731746 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2e68-account-create-update-9fkzh"] Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.733107 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.744526 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.744998 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2e68-account-create-update-9fkzh"] Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.771207 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzs5k\" (UniqueName: \"kubernetes.io/projected/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-kube-api-access-dzs5k\") pod \"keystone-db-create-2knp8\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.771309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-operator-scripts\") pod \"keystone-db-create-2knp8\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.872705 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-operator-scripts\") pod \"keystone-db-create-2knp8\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.872811 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshl7\" (UniqueName: \"kubernetes.io/projected/079879f7-939a-47a7-bbeb-98e1f8d7159b-kube-api-access-wshl7\") pod \"keystone-2e68-account-create-update-9fkzh\" (UID: 
\"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.872887 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzs5k\" (UniqueName: \"kubernetes.io/projected/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-kube-api-access-dzs5k\") pod \"keystone-db-create-2knp8\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.872965 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079879f7-939a-47a7-bbeb-98e1f8d7159b-operator-scripts\") pod \"keystone-2e68-account-create-update-9fkzh\" (UID: \"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.873757 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-operator-scripts\") pod \"keystone-db-create-2knp8\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.902689 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzs5k\" (UniqueName: \"kubernetes.io/projected/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-kube-api-access-dzs5k\") pod \"keystone-db-create-2knp8\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.935481 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hrbft"] Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.936682 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hrbft" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.945037 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hrbft"] Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.951032 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.974858 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079879f7-939a-47a7-bbeb-98e1f8d7159b-operator-scripts\") pod \"keystone-2e68-account-create-update-9fkzh\" (UID: \"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.975226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshl7\" (UniqueName: \"kubernetes.io/projected/079879f7-939a-47a7-bbeb-98e1f8d7159b-kube-api-access-wshl7\") pod \"keystone-2e68-account-create-update-9fkzh\" (UID: \"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.975749 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079879f7-939a-47a7-bbeb-98e1f8d7159b-operator-scripts\") pod \"keystone-2e68-account-create-update-9fkzh\" (UID: \"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:53 crc kubenswrapper[4841]: I0130 05:24:53.995156 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshl7\" (UniqueName: \"kubernetes.io/projected/079879f7-939a-47a7-bbeb-98e1f8d7159b-kube-api-access-wshl7\") pod \"keystone-2e68-account-create-update-9fkzh\" (UID: 
\"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.043230 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2cee-account-create-update-qbxlv"] Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.044377 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.048826 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.054080 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.054712 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2cee-account-create-update-qbxlv"] Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.077012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st4nf\" (UniqueName: \"kubernetes.io/projected/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-kube-api-access-st4nf\") pod \"placement-db-create-hrbft\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " pod="openstack/placement-db-create-hrbft" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.077285 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-operator-scripts\") pod \"placement-db-create-hrbft\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " pod="openstack/placement-db-create-hrbft" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.178782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cwwtv\" (UniqueName: \"kubernetes.io/projected/cbafcabd-b6f1-4c03-839d-4f837803974c-kube-api-access-cwwtv\") pod \"placement-2cee-account-create-update-qbxlv\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.178891 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-operator-scripts\") pod \"placement-db-create-hrbft\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " pod="openstack/placement-db-create-hrbft" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.178934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbafcabd-b6f1-4c03-839d-4f837803974c-operator-scripts\") pod \"placement-2cee-account-create-update-qbxlv\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.178994 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st4nf\" (UniqueName: \"kubernetes.io/projected/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-kube-api-access-st4nf\") pod \"placement-db-create-hrbft\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " pod="openstack/placement-db-create-hrbft" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.181164 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-operator-scripts\") pod \"placement-db-create-hrbft\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " pod="openstack/placement-db-create-hrbft" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.200329 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-st4nf\" (UniqueName: \"kubernetes.io/projected/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-kube-api-access-st4nf\") pod \"placement-db-create-hrbft\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " pod="openstack/placement-db-create-hrbft" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.257098 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hrbft" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.280852 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbafcabd-b6f1-4c03-839d-4f837803974c-operator-scripts\") pod \"placement-2cee-account-create-update-qbxlv\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.281026 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwwtv\" (UniqueName: \"kubernetes.io/projected/cbafcabd-b6f1-4c03-839d-4f837803974c-kube-api-access-cwwtv\") pod \"placement-2cee-account-create-update-qbxlv\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.281502 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbafcabd-b6f1-4c03-839d-4f837803974c-operator-scripts\") pod \"placement-2cee-account-create-update-qbxlv\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.295786 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwwtv\" (UniqueName: 
\"kubernetes.io/projected/cbafcabd-b6f1-4c03-839d-4f837803974c-kube-api-access-cwwtv\") pod \"placement-2cee-account-create-update-qbxlv\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.361897 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:54 crc kubenswrapper[4841]: I0130 05:24:54.688172 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0" Jan 30 05:24:54 crc kubenswrapper[4841]: E0130 05:24:54.688375 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 05:24:54 crc kubenswrapper[4841]: E0130 05:24:54.688432 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 05:24:54 crc kubenswrapper[4841]: E0130 05:24:54.688512 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift podName:b70c69eb-7b62-446a-8748-9a80d6fbe28b nodeName:}" failed. No retries permitted until 2026-01-30 05:25:02.688484603 +0000 UTC m=+1039.681957261 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift") pod "swift-storage-0" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b") : configmap "swift-ring-files" not found Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.356191 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-w85md" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.375760 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.526708 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/455dfc60-4744-463e-88b9-b58de1aa5e4f-operator-scripts\") pod \"455dfc60-4744-463e-88b9-b58de1aa5e4f\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") " Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.527129 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-operator-scripts\") pod \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.527233 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbz9c\" (UniqueName: \"kubernetes.io/projected/455dfc60-4744-463e-88b9-b58de1aa5e4f-kube-api-access-mbz9c\") pod \"455dfc60-4744-463e-88b9-b58de1aa5e4f\" (UID: \"455dfc60-4744-463e-88b9-b58de1aa5e4f\") " Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.527275 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhvnt\" (UniqueName: \"kubernetes.io/projected/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-kube-api-access-zhvnt\") pod \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\" (UID: \"fa02d9bc-89ad-4e58-aa4b-62455308f9e7\") " Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.527425 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455dfc60-4744-463e-88b9-b58de1aa5e4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"455dfc60-4744-463e-88b9-b58de1aa5e4f" (UID: "455dfc60-4744-463e-88b9-b58de1aa5e4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.527847 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/455dfc60-4744-463e-88b9-b58de1aa5e4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.528175 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa02d9bc-89ad-4e58-aa4b-62455308f9e7" (UID: "fa02d9bc-89ad-4e58-aa4b-62455308f9e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.531452 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-kube-api-access-zhvnt" (OuterVolumeSpecName: "kube-api-access-zhvnt") pod "fa02d9bc-89ad-4e58-aa4b-62455308f9e7" (UID: "fa02d9bc-89ad-4e58-aa4b-62455308f9e7"). InnerVolumeSpecName "kube-api-access-zhvnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.532009 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455dfc60-4744-463e-88b9-b58de1aa5e4f-kube-api-access-mbz9c" (OuterVolumeSpecName: "kube-api-access-mbz9c") pod "455dfc60-4744-463e-88b9-b58de1aa5e4f" (UID: "455dfc60-4744-463e-88b9-b58de1aa5e4f"). InnerVolumeSpecName "kube-api-access-mbz9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.629588 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.629626 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbz9c\" (UniqueName: \"kubernetes.io/projected/455dfc60-4744-463e-88b9-b58de1aa5e4f-kube-api-access-mbz9c\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.629641 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhvnt\" (UniqueName: \"kubernetes.io/projected/fa02d9bc-89ad-4e58-aa4b-62455308f9e7-kube-api-access-zhvnt\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.662634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w85md" event={"ID":"fa02d9bc-89ad-4e58-aa4b-62455308f9e7","Type":"ContainerDied","Data":"33643660ad6234b35a632026e43d2bee6bf43514f10ad0371c57f5269c491e79"} Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.662678 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33643660ad6234b35a632026e43d2bee6bf43514f10ad0371c57f5269c491e79" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.662737 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-w85md" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.671632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkv7n" event={"ID":"ff309626-60f6-4110-8b20-5354dab1ca68","Type":"ContainerStarted","Data":"34e0d86f2d28c1bd9d913121757607f09ba8a336c043340479e6bcb08471c2ac"} Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.678977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9t6gr" event={"ID":"455dfc60-4744-463e-88b9-b58de1aa5e4f","Type":"ContainerDied","Data":"fcb379700371fa309b64d17f10ace416365d7e21b863c1e09580d804c2eecbe3"} Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.679017 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb379700371fa309b64d17f10ace416365d7e21b863c1e09580d804c2eecbe3" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.679065 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9t6gr" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.702214 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2knp8"] Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.702207 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jkv7n" podStartSLOduration=2.135547442 podStartE2EDuration="5.702181273s" podCreationTimestamp="2026-01-30 05:24:50 +0000 UTC" firstStartedPulling="2026-01-30 05:24:51.70349144 +0000 UTC m=+1028.696964088" lastFinishedPulling="2026-01-30 05:24:55.270125241 +0000 UTC m=+1032.263597919" observedRunningTime="2026-01-30 05:24:55.698243139 +0000 UTC m=+1032.691715797" watchObservedRunningTime="2026-01-30 05:24:55.702181273 +0000 UTC m=+1032.695653951" Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.750273 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2e68-account-create-update-9fkzh"] Jan 30 05:24:55 crc kubenswrapper[4841]: W0130 05:24:55.765237 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod079879f7_939a_47a7_bbeb_98e1f8d7159b.slice/crio-6be2adb3a206bfc338bdbd3e9031c370d2175ece30a4639a7db227230194a135 WatchSource:0}: Error finding container 6be2adb3a206bfc338bdbd3e9031c370d2175ece30a4639a7db227230194a135: Status 404 returned error can't find the container with id 6be2adb3a206bfc338bdbd3e9031c370d2175ece30a4639a7db227230194a135 Jan 30 05:24:55 crc kubenswrapper[4841]: W0130 05:24:55.838195 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80ca3de9_8a40_47c6_8214_2c7cdb5724c9.slice/crio-11d2f95f7c58ff471b58a91173048e68b1ddb2058af7fc869e2a8690c40a6f3c WatchSource:0}: Error finding container 11d2f95f7c58ff471b58a91173048e68b1ddb2058af7fc869e2a8690c40a6f3c: Status 
404 returned error can't find the container with id 11d2f95f7c58ff471b58a91173048e68b1ddb2058af7fc869e2a8690c40a6f3c Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.842098 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hrbft"] Jan 30 05:24:55 crc kubenswrapper[4841]: I0130 05:24:55.848043 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2cee-account-create-update-qbxlv"] Jan 30 05:24:55 crc kubenswrapper[4841]: W0130 05:24:55.870503 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbafcabd_b6f1_4c03_839d_4f837803974c.slice/crio-98f9da66dc2ac6479a76a9faa816e07385024a5ad8b276fa4e23a52fe8cfbcf5 WatchSource:0}: Error finding container 98f9da66dc2ac6479a76a9faa816e07385024a5ad8b276fa4e23a52fe8cfbcf5: Status 404 returned error can't find the container with id 98f9da66dc2ac6479a76a9faa816e07385024a5ad8b276fa4e23a52fe8cfbcf5 Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.093523 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.156526 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-2szrx"] Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.156788 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" podUID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerName="dnsmasq-dns" containerID="cri-o://a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab" gracePeriod=10 Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.616731 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.650035 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-dns-svc\") pod \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.650142 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-config\") pod \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.650261 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55blv\" (UniqueName: \"kubernetes.io/projected/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-kube-api-access-55blv\") pod \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\" (UID: \"89c123b6-19c8-4eb3-b34e-103dcd6cc16e\") " Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.656804 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-kube-api-access-55blv" (OuterVolumeSpecName: "kube-api-access-55blv") pod "89c123b6-19c8-4eb3-b34e-103dcd6cc16e" (UID: "89c123b6-19c8-4eb3-b34e-103dcd6cc16e"). InnerVolumeSpecName "kube-api-access-55blv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.684932 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-config" (OuterVolumeSpecName: "config") pod "89c123b6-19c8-4eb3-b34e-103dcd6cc16e" (UID: "89c123b6-19c8-4eb3-b34e-103dcd6cc16e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.690569 4841 generic.go:334] "Generic (PLEG): container finished" podID="079879f7-939a-47a7-bbeb-98e1f8d7159b" containerID="e3a84a22abdca1f0ccd6cb67b6a898ad9c44aaeffcafccdee75399832d892e6a" exitCode=0 Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.690638 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2e68-account-create-update-9fkzh" event={"ID":"079879f7-939a-47a7-bbeb-98e1f8d7159b","Type":"ContainerDied","Data":"e3a84a22abdca1f0ccd6cb67b6a898ad9c44aaeffcafccdee75399832d892e6a"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.690668 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2e68-account-create-update-9fkzh" event={"ID":"079879f7-939a-47a7-bbeb-98e1f8d7159b","Type":"ContainerStarted","Data":"6be2adb3a206bfc338bdbd3e9031c370d2175ece30a4639a7db227230194a135"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.695594 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbafcabd-b6f1-4c03-839d-4f837803974c" containerID="beddd0e5331c71a232c6ceda578e9012b0f3d2f15d7b1915e9466873b9283ff4" exitCode=0 Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.695614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2cee-account-create-update-qbxlv" event={"ID":"cbafcabd-b6f1-4c03-839d-4f837803974c","Type":"ContainerDied","Data":"beddd0e5331c71a232c6ceda578e9012b0f3d2f15d7b1915e9466873b9283ff4"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.695646 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2cee-account-create-update-qbxlv" event={"ID":"cbafcabd-b6f1-4c03-839d-4f837803974c","Type":"ContainerStarted","Data":"98f9da66dc2ac6479a76a9faa816e07385024a5ad8b276fa4e23a52fe8cfbcf5"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.697203 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="80ca3de9-8a40-47c6-8214-2c7cdb5724c9" containerID="d429e82bce41d5cf25298d70fbda502e5d597b897e823ca0e7f4e7bee8495c64" exitCode=0 Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.697376 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hrbft" event={"ID":"80ca3de9-8a40-47c6-8214-2c7cdb5724c9","Type":"ContainerDied","Data":"d429e82bce41d5cf25298d70fbda502e5d597b897e823ca0e7f4e7bee8495c64"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.697395 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hrbft" event={"ID":"80ca3de9-8a40-47c6-8214-2c7cdb5724c9","Type":"ContainerStarted","Data":"11d2f95f7c58ff471b58a91173048e68b1ddb2058af7fc869e2a8690c40a6f3c"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.698073 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89c123b6-19c8-4eb3-b34e-103dcd6cc16e" (UID: "89c123b6-19c8-4eb3-b34e-103dcd6cc16e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.699187 4841 generic.go:334] "Generic (PLEG): container finished" podID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerID="a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab" exitCode=0 Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.699214 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.699274 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" event={"ID":"89c123b6-19c8-4eb3-b34e-103dcd6cc16e","Type":"ContainerDied","Data":"a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.699328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-2szrx" event={"ID":"89c123b6-19c8-4eb3-b34e-103dcd6cc16e","Type":"ContainerDied","Data":"e549ff8f4dc6d61a441649f612ea78299cbed0d8a060616e1583c568b4b2df36"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.699356 4841 scope.go:117] "RemoveContainer" containerID="a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.706913 4841 generic.go:334] "Generic (PLEG): container finished" podID="69c5a675-0799-4ed6-b76b-3bfc55c6acbc" containerID="bf835ac6f606ee2b27b85b5eed4b650030810616e051c84d480b09ee6b694955" exitCode=0 Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.707864 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2knp8" event={"ID":"69c5a675-0799-4ed6-b76b-3bfc55c6acbc","Type":"ContainerDied","Data":"bf835ac6f606ee2b27b85b5eed4b650030810616e051c84d480b09ee6b694955"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.707885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2knp8" event={"ID":"69c5a675-0799-4ed6-b76b-3bfc55c6acbc","Type":"ContainerStarted","Data":"fa0b27c4644d33b63c07bccee05afc00b62b1136470bdbc05a5b8efa72877164"} Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.737714 4841 scope.go:117] "RemoveContainer" containerID="4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.751877 4841 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.751902 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.751911 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55blv\" (UniqueName: \"kubernetes.io/projected/89c123b6-19c8-4eb3-b34e-103dcd6cc16e-kube-api-access-55blv\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.757136 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-2szrx"] Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.763668 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-2szrx"] Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.765549 4841 scope.go:117] "RemoveContainer" containerID="a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab" Jan 30 05:24:56 crc kubenswrapper[4841]: E0130 05:24:56.765999 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab\": container with ID starting with a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab not found: ID does not exist" containerID="a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.766025 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab"} err="failed to get container status 
\"a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab\": rpc error: code = NotFound desc = could not find container \"a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab\": container with ID starting with a2e08c2260291cb1d4a38210cd5f939369b128b7735ab85944c0a097f5fc4cab not found: ID does not exist" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.766045 4841 scope.go:117] "RemoveContainer" containerID="4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e" Jan 30 05:24:56 crc kubenswrapper[4841]: E0130 05:24:56.766349 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e\": container with ID starting with 4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e not found: ID does not exist" containerID="4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e" Jan 30 05:24:56 crc kubenswrapper[4841]: I0130 05:24:56.766372 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e"} err="failed to get container status \"4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e\": rpc error: code = NotFound desc = could not find container \"4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e\": container with ID starting with 4ad9e5e5cc9d026cc88df21dc9afd36f60c141301f41ff957b0346cc4f584f5e not found: ID does not exist" Jan 30 05:24:57 crc kubenswrapper[4841]: I0130 05:24:57.280281 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9t6gr"] Jan 30 05:24:57 crc kubenswrapper[4841]: I0130 05:24:57.290997 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9t6gr"] Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.184876 4841 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2knp8" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.281134 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-operator-scripts\") pod \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.281205 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzs5k\" (UniqueName: \"kubernetes.io/projected/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-kube-api-access-dzs5k\") pod \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\" (UID: \"69c5a675-0799-4ed6-b76b-3bfc55c6acbc\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.282310 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69c5a675-0799-4ed6-b76b-3bfc55c6acbc" (UID: "69c5a675-0799-4ed6-b76b-3bfc55c6acbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.282523 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.286079 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-kube-api-access-dzs5k" (OuterVolumeSpecName: "kube-api-access-dzs5k") pod "69c5a675-0799-4ed6-b76b-3bfc55c6acbc" (UID: "69c5a675-0799-4ed6-b76b-3bfc55c6acbc"). InnerVolumeSpecName "kube-api-access-dzs5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.322026 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e68-account-create-update-9fkzh" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.328164 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2cee-account-create-update-qbxlv" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.332337 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hrbft" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.383935 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079879f7-939a-47a7-bbeb-98e1f8d7159b-operator-scripts\") pod \"079879f7-939a-47a7-bbeb-98e1f8d7159b\" (UID: \"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.384109 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st4nf\" (UniqueName: \"kubernetes.io/projected/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-kube-api-access-st4nf\") pod \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.384156 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshl7\" (UniqueName: \"kubernetes.io/projected/079879f7-939a-47a7-bbeb-98e1f8d7159b-kube-api-access-wshl7\") pod \"079879f7-939a-47a7-bbeb-98e1f8d7159b\" (UID: \"079879f7-939a-47a7-bbeb-98e1f8d7159b\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.384225 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-operator-scripts\") pod \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\" (UID: \"80ca3de9-8a40-47c6-8214-2c7cdb5724c9\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.384306 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwwtv\" (UniqueName: \"kubernetes.io/projected/cbafcabd-b6f1-4c03-839d-4f837803974c-kube-api-access-cwwtv\") pod \"cbafcabd-b6f1-4c03-839d-4f837803974c\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.384377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbafcabd-b6f1-4c03-839d-4f837803974c-operator-scripts\") pod \"cbafcabd-b6f1-4c03-839d-4f837803974c\" (UID: \"cbafcabd-b6f1-4c03-839d-4f837803974c\") " Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.384749 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzs5k\" (UniqueName: \"kubernetes.io/projected/69c5a675-0799-4ed6-b76b-3bfc55c6acbc-kube-api-access-dzs5k\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.385340 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80ca3de9-8a40-47c6-8214-2c7cdb5724c9" (UID: "80ca3de9-8a40-47c6-8214-2c7cdb5724c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.385598 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079879f7-939a-47a7-bbeb-98e1f8d7159b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "079879f7-939a-47a7-bbeb-98e1f8d7159b" (UID: "079879f7-939a-47a7-bbeb-98e1f8d7159b"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.386052 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbafcabd-b6f1-4c03-839d-4f837803974c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cbafcabd-b6f1-4c03-839d-4f837803974c" (UID: "cbafcabd-b6f1-4c03-839d-4f837803974c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.389544 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbafcabd-b6f1-4c03-839d-4f837803974c-kube-api-access-cwwtv" (OuterVolumeSpecName: "kube-api-access-cwwtv") pod "cbafcabd-b6f1-4c03-839d-4f837803974c" (UID: "cbafcabd-b6f1-4c03-839d-4f837803974c"). InnerVolumeSpecName "kube-api-access-cwwtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.389574 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-kube-api-access-st4nf" (OuterVolumeSpecName: "kube-api-access-st4nf") pod "80ca3de9-8a40-47c6-8214-2c7cdb5724c9" (UID: "80ca3de9-8a40-47c6-8214-2c7cdb5724c9"). InnerVolumeSpecName "kube-api-access-st4nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.389623 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079879f7-939a-47a7-bbeb-98e1f8d7159b-kube-api-access-wshl7" (OuterVolumeSpecName: "kube-api-access-wshl7") pod "079879f7-939a-47a7-bbeb-98e1f8d7159b" (UID: "079879f7-939a-47a7-bbeb-98e1f8d7159b"). InnerVolumeSpecName "kube-api-access-wshl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.441083 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455dfc60-4744-463e-88b9-b58de1aa5e4f" path="/var/lib/kubelet/pods/455dfc60-4744-463e-88b9-b58de1aa5e4f/volumes" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.441583 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" path="/var/lib/kubelet/pods/89c123b6-19c8-4eb3-b34e-103dcd6cc16e/volumes" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.486454 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st4nf\" (UniqueName: \"kubernetes.io/projected/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-kube-api-access-st4nf\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.486486 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshl7\" (UniqueName: \"kubernetes.io/projected/079879f7-939a-47a7-bbeb-98e1f8d7159b-kube-api-access-wshl7\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.486496 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ca3de9-8a40-47c6-8214-2c7cdb5724c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.486505 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwwtv\" (UniqueName: \"kubernetes.io/projected/cbafcabd-b6f1-4c03-839d-4f837803974c-kube-api-access-cwwtv\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.486514 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbafcabd-b6f1-4c03-839d-4f837803974c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 
05:24:58.486523 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079879f7-939a-47a7-bbeb-98e1f8d7159b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.729348 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2knp8"
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.729365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2knp8" event={"ID":"69c5a675-0799-4ed6-b76b-3bfc55c6acbc","Type":"ContainerDied","Data":"fa0b27c4644d33b63c07bccee05afc00b62b1136470bdbc05a5b8efa72877164"}
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.729481 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa0b27c4644d33b63c07bccee05afc00b62b1136470bdbc05a5b8efa72877164"
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.731260 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2e68-account-create-update-9fkzh" event={"ID":"079879f7-939a-47a7-bbeb-98e1f8d7159b","Type":"ContainerDied","Data":"6be2adb3a206bfc338bdbd3e9031c370d2175ece30a4639a7db227230194a135"}
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.731342 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be2adb3a206bfc338bdbd3e9031c370d2175ece30a4639a7db227230194a135"
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.731562 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e68-account-create-update-9fkzh"
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.736542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2cee-account-create-update-qbxlv" event={"ID":"cbafcabd-b6f1-4c03-839d-4f837803974c","Type":"ContainerDied","Data":"98f9da66dc2ac6479a76a9faa816e07385024a5ad8b276fa4e23a52fe8cfbcf5"}
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.736575 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98f9da66dc2ac6479a76a9faa816e07385024a5ad8b276fa4e23a52fe8cfbcf5"
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.736646 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2cee-account-create-update-qbxlv"
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.738713 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hrbft" event={"ID":"80ca3de9-8a40-47c6-8214-2c7cdb5724c9","Type":"ContainerDied","Data":"11d2f95f7c58ff471b58a91173048e68b1ddb2058af7fc869e2a8690c40a6f3c"}
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.738732 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11d2f95f7c58ff471b58a91173048e68b1ddb2058af7fc869e2a8690c40a6f3c"
Jan 30 05:24:58 crc kubenswrapper[4841]: I0130 05:24:58.738807 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hrbft"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.654751 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.686904 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mtgqw"]
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687279 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerName="dnsmasq-dns"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687306 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerName="dnsmasq-dns"
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687339 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ca3de9-8a40-47c6-8214-2c7cdb5724c9" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687348 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ca3de9-8a40-47c6-8214-2c7cdb5724c9" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687362 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c5a675-0799-4ed6-b76b-3bfc55c6acbc" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687370 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c5a675-0799-4ed6-b76b-3bfc55c6acbc" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687390 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa02d9bc-89ad-4e58-aa4b-62455308f9e7" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687417 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa02d9bc-89ad-4e58-aa4b-62455308f9e7" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687435 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079879f7-939a-47a7-bbeb-98e1f8d7159b" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687443 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="079879f7-939a-47a7-bbeb-98e1f8d7159b" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687455 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455dfc60-4744-463e-88b9-b58de1aa5e4f" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687463 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="455dfc60-4744-463e-88b9-b58de1aa5e4f" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687472 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbafcabd-b6f1-4c03-839d-4f837803974c" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687479 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbafcabd-b6f1-4c03-839d-4f837803974c" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: E0130 05:24:59.687492 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerName="init"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687499 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerName="init"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687696 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbafcabd-b6f1-4c03-839d-4f837803974c" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687717 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c123b6-19c8-4eb3-b34e-103dcd6cc16e" containerName="dnsmasq-dns"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687730 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="079879f7-939a-47a7-bbeb-98e1f8d7159b" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687742 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ca3de9-8a40-47c6-8214-2c7cdb5724c9" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687756 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c5a675-0799-4ed6-b76b-3bfc55c6acbc" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687767 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa02d9bc-89ad-4e58-aa4b-62455308f9e7" containerName="mariadb-database-create"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.687777 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="455dfc60-4744-463e-88b9-b58de1aa5e4f" containerName="mariadb-account-create-update"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.688342 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.690654 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tfq6h"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.691118 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.705216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-db-sync-config-data\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.705589 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c974q\" (UniqueName: \"kubernetes.io/projected/bef5639f-2a56-4255-ae24-a8f794a7b715-kube-api-access-c974q\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.705651 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-config-data\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.705806 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-combined-ca-bundle\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.709866 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mtgqw"]
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.808100 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-combined-ca-bundle\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.808218 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-db-sync-config-data\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.808251 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c974q\" (UniqueName: \"kubernetes.io/projected/bef5639f-2a56-4255-ae24-a8f794a7b715-kube-api-access-c974q\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.808298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-config-data\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.811976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-db-sync-config-data\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.812239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-config-data\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.824185 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-combined-ca-bundle\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:24:59 crc kubenswrapper[4841]: I0130 05:24:59.833507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c974q\" (UniqueName: \"kubernetes.io/projected/bef5639f-2a56-4255-ae24-a8f794a7b715-kube-api-access-c974q\") pod \"glance-db-sync-mtgqw\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") " pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.004498 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.350983 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mtgqw"]
Jan 30 05:25:00 crc kubenswrapper[4841]: W0130 05:25:00.357372 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbef5639f_2a56_4255_ae24_a8f794a7b715.slice/crio-771694525e8d562e923111942b353ecef6077d76a9710f623bed79843db591e1 WatchSource:0}: Error finding container 771694525e8d562e923111942b353ecef6077d76a9710f623bed79843db591e1: Status 404 returned error can't find the container with id 771694525e8d562e923111942b353ecef6077d76a9710f623bed79843db591e1
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.755932 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mtgqw" event={"ID":"bef5639f-2a56-4255-ae24-a8f794a7b715","Type":"ContainerStarted","Data":"771694525e8d562e923111942b353ecef6077d76a9710f623bed79843db591e1"}
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.889009 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bjfr4"]
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.890193 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.892900 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.908748 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bjfr4"]
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.928058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhvdc\" (UniqueName: \"kubernetes.io/projected/6183a9f6-a078-47ff-afa9-fbd512678218-kube-api-access-vhvdc\") pod \"root-account-create-update-bjfr4\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") " pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:00 crc kubenswrapper[4841]: I0130 05:25:00.928381 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6183a9f6-a078-47ff-afa9-fbd512678218-operator-scripts\") pod \"root-account-create-update-bjfr4\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") " pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:01 crc kubenswrapper[4841]: I0130 05:25:01.029838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6183a9f6-a078-47ff-afa9-fbd512678218-operator-scripts\") pod \"root-account-create-update-bjfr4\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") " pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:01 crc kubenswrapper[4841]: I0130 05:25:01.029906 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhvdc\" (UniqueName: \"kubernetes.io/projected/6183a9f6-a078-47ff-afa9-fbd512678218-kube-api-access-vhvdc\") pod \"root-account-create-update-bjfr4\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") " pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:01 crc kubenswrapper[4841]: I0130 05:25:01.030728 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6183a9f6-a078-47ff-afa9-fbd512678218-operator-scripts\") pod \"root-account-create-update-bjfr4\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") " pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:01 crc kubenswrapper[4841]: I0130 05:25:01.073875 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhvdc\" (UniqueName: \"kubernetes.io/projected/6183a9f6-a078-47ff-afa9-fbd512678218-kube-api-access-vhvdc\") pod \"root-account-create-update-bjfr4\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") " pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:01 crc kubenswrapper[4841]: I0130 05:25:01.216953 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:01 crc kubenswrapper[4841]: I0130 05:25:01.690086 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bjfr4"]
Jan 30 05:25:01 crc kubenswrapper[4841]: W0130 05:25:01.711945 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6183a9f6_a078_47ff_afa9_fbd512678218.slice/crio-2f524a757c900201c10f68195e825075c13cace5c3f2f25a825f03d121865f2d WatchSource:0}: Error finding container 2f524a757c900201c10f68195e825075c13cace5c3f2f25a825f03d121865f2d: Status 404 returned error can't find the container with id 2f524a757c900201c10f68195e825075c13cace5c3f2f25a825f03d121865f2d
Jan 30 05:25:01 crc kubenswrapper[4841]: I0130 05:25:01.771739 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bjfr4" event={"ID":"6183a9f6-a078-47ff-afa9-fbd512678218","Type":"ContainerStarted","Data":"2f524a757c900201c10f68195e825075c13cace5c3f2f25a825f03d121865f2d"}
Jan 30 05:25:02 crc kubenswrapper[4841]: I0130 05:25:02.779645 4841 generic.go:334] "Generic (PLEG): container finished" podID="ff309626-60f6-4110-8b20-5354dab1ca68" containerID="34e0d86f2d28c1bd9d913121757607f09ba8a336c043340479e6bcb08471c2ac" exitCode=0
Jan 30 05:25:02 crc kubenswrapper[4841]: I0130 05:25:02.779983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkv7n" event={"ID":"ff309626-60f6-4110-8b20-5354dab1ca68","Type":"ContainerDied","Data":"34e0d86f2d28c1bd9d913121757607f09ba8a336c043340479e6bcb08471c2ac"}
Jan 30 05:25:02 crc kubenswrapper[4841]: I0130 05:25:02.781896 4841 generic.go:334] "Generic (PLEG): container finished" podID="6183a9f6-a078-47ff-afa9-fbd512678218" containerID="a56f6d727a91da435870dbaf6b07fb62f03128af7ecaf22716aced269514b47c" exitCode=0
Jan 30 05:25:02 crc kubenswrapper[4841]: I0130 05:25:02.781944 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bjfr4" event={"ID":"6183a9f6-a078-47ff-afa9-fbd512678218","Type":"ContainerDied","Data":"a56f6d727a91da435870dbaf6b07fb62f03128af7ecaf22716aced269514b47c"}
Jan 30 05:25:02 crc kubenswrapper[4841]: I0130 05:25:02.789908 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0"
Jan 30 05:25:02 crc kubenswrapper[4841]: I0130 05:25:02.807494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"swift-storage-0\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " pod="openstack/swift-storage-0"
Jan 30 05:25:02 crc kubenswrapper[4841]: I0130 05:25:02.827908 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Jan 30 05:25:03 crc kubenswrapper[4841]: I0130 05:25:03.379881 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Jan 30 05:25:03 crc kubenswrapper[4841]: W0130 05:25:03.390573 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb70c69eb_7b62_446a_8748_9a80d6fbe28b.slice/crio-b6be38aa978c80c7d631871602aee59c9e2747140cd9edc429408448302bfdaa WatchSource:0}: Error finding container b6be38aa978c80c7d631871602aee59c9e2747140cd9edc429408448302bfdaa: Status 404 returned error can't find the container with id b6be38aa978c80c7d631871602aee59c9e2747140cd9edc429408448302bfdaa
Jan 30 05:25:03 crc kubenswrapper[4841]: I0130 05:25:03.794582 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"b6be38aa978c80c7d631871602aee59c9e2747140cd9edc429408448302bfdaa"}
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.267021 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.274460 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jkv7n"
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.460745 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff309626-60f6-4110-8b20-5354dab1ca68-etc-swift\") pod \"ff309626-60f6-4110-8b20-5354dab1ca68\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.460814 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhvdc\" (UniqueName: \"kubernetes.io/projected/6183a9f6-a078-47ff-afa9-fbd512678218-kube-api-access-vhvdc\") pod \"6183a9f6-a078-47ff-afa9-fbd512678218\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.460854 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-ring-data-devices\") pod \"ff309626-60f6-4110-8b20-5354dab1ca68\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.460881 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6183a9f6-a078-47ff-afa9-fbd512678218-operator-scripts\") pod \"6183a9f6-a078-47ff-afa9-fbd512678218\" (UID: \"6183a9f6-a078-47ff-afa9-fbd512678218\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.460914 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvvmt\" (UniqueName: \"kubernetes.io/projected/ff309626-60f6-4110-8b20-5354dab1ca68-kube-api-access-mvvmt\") pod \"ff309626-60f6-4110-8b20-5354dab1ca68\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.460973 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-dispersionconf\") pod \"ff309626-60f6-4110-8b20-5354dab1ca68\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.461009 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-swiftconf\") pod \"ff309626-60f6-4110-8b20-5354dab1ca68\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.461045 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-combined-ca-bundle\") pod \"ff309626-60f6-4110-8b20-5354dab1ca68\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.461116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-scripts\") pod \"ff309626-60f6-4110-8b20-5354dab1ca68\" (UID: \"ff309626-60f6-4110-8b20-5354dab1ca68\") "
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.461878 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff309626-60f6-4110-8b20-5354dab1ca68-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ff309626-60f6-4110-8b20-5354dab1ca68" (UID: "ff309626-60f6-4110-8b20-5354dab1ca68"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.465183 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ff309626-60f6-4110-8b20-5354dab1ca68" (UID: "ff309626-60f6-4110-8b20-5354dab1ca68"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.465985 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6183a9f6-a078-47ff-afa9-fbd512678218-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6183a9f6-a078-47ff-afa9-fbd512678218" (UID: "6183a9f6-a078-47ff-afa9-fbd512678218"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.471012 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6183a9f6-a078-47ff-afa9-fbd512678218-kube-api-access-vhvdc" (OuterVolumeSpecName: "kube-api-access-vhvdc") pod "6183a9f6-a078-47ff-afa9-fbd512678218" (UID: "6183a9f6-a078-47ff-afa9-fbd512678218"). InnerVolumeSpecName "kube-api-access-vhvdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.483787 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff309626-60f6-4110-8b20-5354dab1ca68-kube-api-access-mvvmt" (OuterVolumeSpecName: "kube-api-access-mvvmt") pod "ff309626-60f6-4110-8b20-5354dab1ca68" (UID: "ff309626-60f6-4110-8b20-5354dab1ca68"). InnerVolumeSpecName "kube-api-access-mvvmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.485012 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ff309626-60f6-4110-8b20-5354dab1ca68" (UID: "ff309626-60f6-4110-8b20-5354dab1ca68"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.501966 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ff309626-60f6-4110-8b20-5354dab1ca68" (UID: "ff309626-60f6-4110-8b20-5354dab1ca68"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.512181 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-scripts" (OuterVolumeSpecName: "scripts") pod "ff309626-60f6-4110-8b20-5354dab1ca68" (UID: "ff309626-60f6-4110-8b20-5354dab1ca68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.515522 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff309626-60f6-4110-8b20-5354dab1ca68" (UID: "ff309626-60f6-4110-8b20-5354dab1ca68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563314 4841 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-ring-data-devices\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563350 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6183a9f6-a078-47ff-afa9-fbd512678218-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563363 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvvmt\" (UniqueName: \"kubernetes.io/projected/ff309626-60f6-4110-8b20-5354dab1ca68-kube-api-access-mvvmt\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563378 4841 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-dispersionconf\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563389 4841 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-swiftconf\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563431 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff309626-60f6-4110-8b20-5354dab1ca68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563443 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff309626-60f6-4110-8b20-5354dab1ca68-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563455 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff309626-60f6-4110-8b20-5354dab1ca68-etc-swift\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.563466 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhvdc\" (UniqueName: \"kubernetes.io/projected/6183a9f6-a078-47ff-afa9-fbd512678218-kube-api-access-vhvdc\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.596606 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-88rdn" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" probeResult="failure" output=<
Jan 30 05:25:04 crc kubenswrapper[4841]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 30 05:25:04 crc kubenswrapper[4841]: >
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.602107 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lbv2q"
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.806199 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkv7n" event={"ID":"ff309626-60f6-4110-8b20-5354dab1ca68","Type":"ContainerDied","Data":"382de7a4b251ae28aa4add4ffb25c244e79f5de637cfc79ac31a53eada41f353"}
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.806237 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="382de7a4b251ae28aa4add4ffb25c244e79f5de637cfc79ac31a53eada41f353"
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.806338 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jkv7n"
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.815247 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bjfr4" event={"ID":"6183a9f6-a078-47ff-afa9-fbd512678218","Type":"ContainerDied","Data":"2f524a757c900201c10f68195e825075c13cace5c3f2f25a825f03d121865f2d"}
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.815286 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f524a757c900201c10f68195e825075c13cace5c3f2f25a825f03d121865f2d"
Jan 30 05:25:04 crc kubenswrapper[4841]: I0130 05:25:04.815340 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bjfr4"
Jan 30 05:25:05 crc kubenswrapper[4841]: I0130 05:25:05.824720 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c"}
Jan 30 05:25:05 crc kubenswrapper[4841]: I0130 05:25:05.825269 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405"}
Jan 30 05:25:05 crc kubenswrapper[4841]: I0130 05:25:05.825280 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14"}
Jan 30 05:25:05 crc kubenswrapper[4841]: I0130 05:25:05.825287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb"}
Jan 30 05:25:06 crc kubenswrapper[4841]: I0130 05:25:06.834964 4841 generic.go:334] "Generic (PLEG): container finished" podID="cc423120-ba93-465b-8ef8-871904b901ef" containerID="7f1ddbf696f7e72c83c4c0ee8f83a7e88cf635df1e2c2faa46855be05f07c1a9" exitCode=0
Jan 30 05:25:06 crc kubenswrapper[4841]: I0130 05:25:06.835015 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc423120-ba93-465b-8ef8-871904b901ef","Type":"ContainerDied","Data":"7f1ddbf696f7e72c83c4c0ee8f83a7e88cf635df1e2c2faa46855be05f07c1a9"}
Jan 30 05:25:07 crc kubenswrapper[4841]: I0130 05:25:07.268140 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bjfr4"]
Jan 30 05:25:07 crc kubenswrapper[4841]: I0130 05:25:07.273461 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bjfr4"]
Jan 30 05:25:07 crc kubenswrapper[4841]: I0130 05:25:07.845634 4841 generic.go:334] "Generic (PLEG): container finished" podID="ad7779ad-0912-4695-853f-3ce786c2e9ae" containerID="4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8" exitCode=0
Jan 30 05:25:07 crc kubenswrapper[4841]: I0130 05:25:07.845695 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad7779ad-0912-4695-853f-3ce786c2e9ae","Type":"ContainerDied","Data":"4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8"}
Jan 30 05:25:08 crc kubenswrapper[4841]: I0130 05:25:08.444573 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6183a9f6-a078-47ff-afa9-fbd512678218" path="/var/lib/kubelet/pods/6183a9f6-a078-47ff-afa9-fbd512678218/volumes"
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.594571 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-88rdn" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" probeResult="failure" output=<
Jan 30 05:25:09 crc kubenswrapper[4841]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 30 05:25:09 crc kubenswrapper[4841]: >
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.609186 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lbv2q"
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.831459 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-88rdn-config-d2h44"]
Jan 30 05:25:09 crc kubenswrapper[4841]: E0130 05:25:09.831898 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff309626-60f6-4110-8b20-5354dab1ca68" containerName="swift-ring-rebalance"
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.831912 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff309626-60f6-4110-8b20-5354dab1ca68" containerName="swift-ring-rebalance"
Jan 30 05:25:09 crc kubenswrapper[4841]: E0130 05:25:09.831929 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6183a9f6-a078-47ff-afa9-fbd512678218" containerName="mariadb-account-create-update"
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.831939 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6183a9f6-a078-47ff-afa9-fbd512678218" containerName="mariadb-account-create-update"
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.832164 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff309626-60f6-4110-8b20-5354dab1ca68" containerName="swift-ring-rebalance"
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.832182 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6183a9f6-a078-47ff-afa9-fbd512678218" containerName="mariadb-account-create-update"
Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.832934 4841 util.go:30]
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.840850 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.843911 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-88rdn-config-d2h44"] Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.964843 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-scripts\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.965193 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvm5\" (UniqueName: \"kubernetes.io/projected/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-kube-api-access-lxvm5\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.965287 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run-ovn\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.965444 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-log-ovn\") pod 
\"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.965610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-additional-scripts\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:09 crc kubenswrapper[4841]: I0130 05:25:09.965664 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.067224 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvm5\" (UniqueName: \"kubernetes.io/projected/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-kube-api-access-lxvm5\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.067302 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run-ovn\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.067353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-log-ovn\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.067764 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run-ovn\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.067814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-log-ovn\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.068647 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-additional-scripts\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.069514 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.069528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-additional-scripts\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.069581 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.069991 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-scripts\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.074299 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-scripts\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.105515 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvm5\" (UniqueName: \"kubernetes.io/projected/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-kube-api-access-lxvm5\") pod \"ovn-controller-88rdn-config-d2h44\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:10 crc kubenswrapper[4841]: I0130 05:25:10.166334 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.303313 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xxskp"] Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.306193 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.309923 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.312912 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xxskp"] Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.425007 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80c9d6e-25f1-4629-97f0-724c2353944b-operator-scripts\") pod \"root-account-create-update-xxskp\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.425299 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xhj\" (UniqueName: \"kubernetes.io/projected/d80c9d6e-25f1-4629-97f0-724c2353944b-kube-api-access-86xhj\") pod \"root-account-create-update-xxskp\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.527238 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80c9d6e-25f1-4629-97f0-724c2353944b-operator-scripts\") pod \"root-account-create-update-xxskp\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " 
pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.527314 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xhj\" (UniqueName: \"kubernetes.io/projected/d80c9d6e-25f1-4629-97f0-724c2353944b-kube-api-access-86xhj\") pod \"root-account-create-update-xxskp\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.528960 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80c9d6e-25f1-4629-97f0-724c2353944b-operator-scripts\") pod \"root-account-create-update-xxskp\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.543627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xhj\" (UniqueName: \"kubernetes.io/projected/d80c9d6e-25f1-4629-97f0-724c2353944b-kube-api-access-86xhj\") pod \"root-account-create-update-xxskp\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.715186 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.744544 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-88rdn-config-d2h44"] Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.892593 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc423120-ba93-465b-8ef8-871904b901ef","Type":"ContainerStarted","Data":"a19190f1ad60a61a31c984cadceaa8ab89c01149a3adc2e54a53efba5d740bd4"} Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.892826 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.895511 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad7779ad-0912-4695-853f-3ce786c2e9ae","Type":"ContainerStarted","Data":"7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97"} Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.895694 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.923765 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.183522179 podStartE2EDuration="1m3.923747874s" podCreationTimestamp="2026-01-30 05:24:09 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.309369812 +0000 UTC m=+1000.302842450" lastFinishedPulling="2026-01-30 05:24:31.049595497 +0000 UTC m=+1008.043068145" observedRunningTime="2026-01-30 05:25:12.918335069 +0000 UTC m=+1049.911807707" watchObservedRunningTime="2026-01-30 05:25:12.923747874 +0000 UTC m=+1049.917220522" Jan 30 05:25:12 crc kubenswrapper[4841]: I0130 05:25:12.954147 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=55.980922838 podStartE2EDuration="1m3.954130416s" podCreationTimestamp="2026-01-30 05:24:09 +0000 UTC" firstStartedPulling="2026-01-30 05:24:23.786763456 +0000 UTC m=+1000.780236104" lastFinishedPulling="2026-01-30 05:24:31.759971034 +0000 UTC m=+1008.753443682" observedRunningTime="2026-01-30 05:25:12.952087961 +0000 UTC m=+1049.945560599" watchObservedRunningTime="2026-01-30 05:25:12.954130416 +0000 UTC m=+1049.947603054" Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.202074 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xxskp"] Jan 30 05:25:13 crc kubenswrapper[4841]: W0130 05:25:13.237491 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd80c9d6e_25f1_4629_97f0_724c2353944b.slice/crio-72eaf23484520ca615dfec5ef9d10e92964c1a29f9ae95d6ce2a8bdb5b2fdffc WatchSource:0}: Error finding container 72eaf23484520ca615dfec5ef9d10e92964c1a29f9ae95d6ce2a8bdb5b2fdffc: Status 404 returned error can't find the container with id 72eaf23484520ca615dfec5ef9d10e92964c1a29f9ae95d6ce2a8bdb5b2fdffc Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.903410 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mtgqw" event={"ID":"bef5639f-2a56-4255-ae24-a8f794a7b715","Type":"ContainerStarted","Data":"99b725593da174a758fc675f27f806a5cc026e7e2b8a1e1f72cd1b69bbef305c"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.905069 4841 generic.go:334] "Generic (PLEG): container finished" podID="d80c9d6e-25f1-4629-97f0-724c2353944b" containerID="15adcc49aa8a94852a40b3a0596b418fd77363d9823b75d779030e55e32d5311" exitCode=0 Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.905119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xxskp" 
event={"ID":"d80c9d6e-25f1-4629-97f0-724c2353944b","Type":"ContainerDied","Data":"15adcc49aa8a94852a40b3a0596b418fd77363d9823b75d779030e55e32d5311"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.905135 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xxskp" event={"ID":"d80c9d6e-25f1-4629-97f0-724c2353944b","Type":"ContainerStarted","Data":"72eaf23484520ca615dfec5ef9d10e92964c1a29f9ae95d6ce2a8bdb5b2fdffc"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.908754 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.908777 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.908787 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.908795 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.909952 4841 generic.go:334] "Generic (PLEG): container finished" podID="07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" containerID="85111eda2e1bde25e3a60cad27ea30e9be9e11c5f182fa65d1bb53519be46498" exitCode=0 Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.910068 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn-config-d2h44" event={"ID":"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc","Type":"ContainerDied","Data":"85111eda2e1bde25e3a60cad27ea30e9be9e11c5f182fa65d1bb53519be46498"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.910112 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn-config-d2h44" event={"ID":"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc","Type":"ContainerStarted","Data":"0d563ede3e34ded89d7b73bb63a43577aa3ba7cc0a5ba5c809a833767dd9e629"} Jan 30 05:25:13 crc kubenswrapper[4841]: I0130 05:25:13.921356 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mtgqw" podStartSLOduration=2.882345129 podStartE2EDuration="14.921344313s" podCreationTimestamp="2026-01-30 05:24:59 +0000 UTC" firstStartedPulling="2026-01-30 05:25:00.359754908 +0000 UTC m=+1037.353227546" lastFinishedPulling="2026-01-30 05:25:12.398754092 +0000 UTC m=+1049.392226730" observedRunningTime="2026-01-30 05:25:13.91790148 +0000 UTC m=+1050.911374118" watchObservedRunningTime="2026-01-30 05:25:13.921344313 +0000 UTC m=+1050.914816961" Jan 30 05:25:14 crc kubenswrapper[4841]: I0130 05:25:14.600482 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-88rdn" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.267747 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xxskp" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.378323 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80c9d6e-25f1-4629-97f0-724c2353944b-operator-scripts\") pod \"d80c9d6e-25f1-4629-97f0-724c2353944b\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.378871 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86xhj\" (UniqueName: \"kubernetes.io/projected/d80c9d6e-25f1-4629-97f0-724c2353944b-kube-api-access-86xhj\") pod \"d80c9d6e-25f1-4629-97f0-724c2353944b\" (UID: \"d80c9d6e-25f1-4629-97f0-724c2353944b\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.378950 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80c9d6e-25f1-4629-97f0-724c2353944b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d80c9d6e-25f1-4629-97f0-724c2353944b" (UID: "d80c9d6e-25f1-4629-97f0-724c2353944b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.379495 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80c9d6e-25f1-4629-97f0-724c2353944b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.385492 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80c9d6e-25f1-4629-97f0-724c2353944b-kube-api-access-86xhj" (OuterVolumeSpecName: "kube-api-access-86xhj") pod "d80c9d6e-25f1-4629-97f0-724c2353944b" (UID: "d80c9d6e-25f1-4629-97f0-724c2353944b"). InnerVolumeSpecName "kube-api-access-86xhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.441646 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn-config-d2h44" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.480925 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86xhj\" (UniqueName: \"kubernetes.io/projected/d80c9d6e-25f1-4629-97f0-724c2353944b-kube-api-access-86xhj\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.581888 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvm5\" (UniqueName: \"kubernetes.io/projected/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-kube-api-access-lxvm5\") pod \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.581927 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-log-ovn\") pod \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.581977 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run\") pod \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582015 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run-ovn\") pod \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582132 
4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-scripts\") pod \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582048 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" (UID: "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582068 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" (UID: "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582090 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run" (OuterVolumeSpecName: "var-run") pod "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" (UID: "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582266 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-additional-scripts\") pod \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\" (UID: \"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc\") " Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582599 4841 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582614 4841 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582621 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582974 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" (UID: "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.582996 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-scripts" (OuterVolumeSpecName: "scripts") pod "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" (UID: "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.586572 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-kube-api-access-lxvm5" (OuterVolumeSpecName: "kube-api-access-lxvm5") pod "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" (UID: "07f8c080-6cdc-4ea5-af1e-b361ae2cccdc"). InnerVolumeSpecName "kube-api-access-lxvm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.684685 4841 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-additional-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.684721 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxvm5\" (UniqueName: \"kubernetes.io/projected/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-kube-api-access-lxvm5\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.684737 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.924106 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn-config-d2h44" event={"ID":"07f8c080-6cdc-4ea5-af1e-b361ae2cccdc","Type":"ContainerDied","Data":"0d563ede3e34ded89d7b73bb63a43577aa3ba7cc0a5ba5c809a833767dd9e629"}
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.924162 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d563ede3e34ded89d7b73bb63a43577aa3ba7cc0a5ba5c809a833767dd9e629"
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.924140 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn-config-d2h44"
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.925767 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xxskp"
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.925775 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xxskp" event={"ID":"d80c9d6e-25f1-4629-97f0-724c2353944b","Type":"ContainerDied","Data":"72eaf23484520ca615dfec5ef9d10e92964c1a29f9ae95d6ce2a8bdb5b2fdffc"}
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.925818 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72eaf23484520ca615dfec5ef9d10e92964c1a29f9ae95d6ce2a8bdb5b2fdffc"
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.930710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403"}
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.930751 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e"}
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.930760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab"}
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.930769 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63"}
Jan 30 05:25:15 crc kubenswrapper[4841]: I0130 05:25:15.930777 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07"}
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.549888 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-88rdn-config-d2h44"]
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.557234 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-88rdn-config-d2h44"]
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.718242 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-88rdn-config-v7px2"]
Jan 30 05:25:16 crc kubenswrapper[4841]: E0130 05:25:16.718566 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80c9d6e-25f1-4629-97f0-724c2353944b" containerName="mariadb-account-create-update"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.718581 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80c9d6e-25f1-4629-97f0-724c2353944b" containerName="mariadb-account-create-update"
Jan 30 05:25:16 crc kubenswrapper[4841]: E0130 05:25:16.718598 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" containerName="ovn-config"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.718604 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" containerName="ovn-config"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.718758 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" containerName="ovn-config"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.718778 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d80c9d6e-25f1-4629-97f0-724c2353944b" containerName="mariadb-account-create-update"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.719231 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.720831 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.733893 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-88rdn-config-v7px2"]
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.801996 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.802230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-log-ovn\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.802311 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run-ovn\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.802435 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzp6\" (UniqueName: \"kubernetes.io/projected/85a15a38-806b-4448-acee-22be2a90388f-kube-api-access-7vzp6\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.802546 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-additional-scripts\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.802623 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-scripts\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904232 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-additional-scripts\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904275 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-scripts\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904344 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904382 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-log-ovn\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904417 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run-ovn\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904449 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzp6\" (UniqueName: \"kubernetes.io/projected/85a15a38-806b-4448-acee-22be2a90388f-kube-api-access-7vzp6\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-log-ovn\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904768 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.904814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run-ovn\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.905842 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-additional-scripts\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.906509 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-scripts\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.922011 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzp6\" (UniqueName: \"kubernetes.io/projected/85a15a38-806b-4448-acee-22be2a90388f-kube-api-access-7vzp6\") pod \"ovn-controller-88rdn-config-v7px2\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") " pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.944650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c"}
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.944709 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerStarted","Data":"ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864"}
Jan 30 05:25:16 crc kubenswrapper[4841]: I0130 05:25:16.988283 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.43246029 podStartE2EDuration="31.988257952s" podCreationTimestamp="2026-01-30 05:24:45 +0000 UTC" firstStartedPulling="2026-01-30 05:25:03.392673388 +0000 UTC m=+1040.386146026" lastFinishedPulling="2026-01-30 05:25:14.94847105 +0000 UTC m=+1051.941943688" observedRunningTime="2026-01-30 05:25:16.981131209 +0000 UTC m=+1053.974603887" watchObservedRunningTime="2026-01-30 05:25:16.988257952 +0000 UTC m=+1053.981730630"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.038899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.271560 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8db84466c-2l7ck"]
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.273174 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.275297 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.279558 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-2l7ck"]
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.412385 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-nb\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.412441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-swift-storage-0\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.412679 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-sb\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.412754 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-config\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.413031 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-svc\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.413083 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkks9\" (UniqueName: \"kubernetes.io/projected/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-kube-api-access-vkks9\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.514469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-config\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.514498 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-88rdn-config-v7px2"]
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.514580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-svc\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.514603 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkks9\" (UniqueName: \"kubernetes.io/projected/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-kube-api-access-vkks9\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.514656 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-nb\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.514684 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-swift-storage-0\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.514740 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-sb\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.515446 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-svc\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.515536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-sb\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.515608 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-swift-storage-0\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.516152 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-nb\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.516566 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-config\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.543799 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkks9\" (UniqueName: \"kubernetes.io/projected/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-kube-api-access-vkks9\") pod \"dnsmasq-dns-8db84466c-2l7ck\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.598960 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.953014 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn-config-v7px2" event={"ID":"85a15a38-806b-4448-acee-22be2a90388f","Type":"ContainerStarted","Data":"0182770a249e71ae68983a8beb058b839e3edb9972bf7c96097b9a39e6587d5f"}
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.953324 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn-config-v7px2" event={"ID":"85a15a38-806b-4448-acee-22be2a90388f","Type":"ContainerStarted","Data":"b8e9e5c7f7568b66a6c638cbf47f8d07c6088312ce5a5ae4fe71e46312270311"}
Jan 30 05:25:17 crc kubenswrapper[4841]: I0130 05:25:17.985277 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-88rdn-config-v7px2" podStartSLOduration=1.985258874 podStartE2EDuration="1.985258874s" podCreationTimestamp="2026-01-30 05:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:17.980212088 +0000 UTC m=+1054.973684726" watchObservedRunningTime="2026-01-30 05:25:17.985258874 +0000 UTC m=+1054.978731512"
Jan 30 05:25:18 crc kubenswrapper[4841]: I0130 05:25:18.083251 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-2l7ck"]
Jan 30 05:25:18 crc kubenswrapper[4841]: W0130 05:25:18.091141 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c4ca664_bb24_47c4_a5d8_c242f414b2ac.slice/crio-0b664b4f8b9d7528f694c2740cf35dbff8e6ed5294480f79c152be5f0c8740eb WatchSource:0}: Error finding container 0b664b4f8b9d7528f694c2740cf35dbff8e6ed5294480f79c152be5f0c8740eb: Status 404 returned error can't find the container with id 0b664b4f8b9d7528f694c2740cf35dbff8e6ed5294480f79c152be5f0c8740eb
Jan 30 05:25:18 crc kubenswrapper[4841]: I0130 05:25:18.440474 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f8c080-6cdc-4ea5-af1e-b361ae2cccdc" path="/var/lib/kubelet/pods/07f8c080-6cdc-4ea5-af1e-b361ae2cccdc/volumes"
Jan 30 05:25:18 crc kubenswrapper[4841]: I0130 05:25:18.964734 4841 generic.go:334] "Generic (PLEG): container finished" podID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" containerID="1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398" exitCode=0
Jan 30 05:25:18 crc kubenswrapper[4841]: I0130 05:25:18.964838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" event={"ID":"8c4ca664-bb24-47c4-a5d8-c242f414b2ac","Type":"ContainerDied","Data":"1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398"}
Jan 30 05:25:18 crc kubenswrapper[4841]: I0130 05:25:18.965122 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" event={"ID":"8c4ca664-bb24-47c4-a5d8-c242f414b2ac","Type":"ContainerStarted","Data":"0b664b4f8b9d7528f694c2740cf35dbff8e6ed5294480f79c152be5f0c8740eb"}
Jan 30 05:25:18 crc kubenswrapper[4841]: I0130 05:25:18.970357 4841 generic.go:334] "Generic (PLEG): container finished" podID="85a15a38-806b-4448-acee-22be2a90388f" containerID="0182770a249e71ae68983a8beb058b839e3edb9972bf7c96097b9a39e6587d5f" exitCode=0
Jan 30 05:25:18 crc kubenswrapper[4841]: I0130 05:25:18.970394 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn-config-v7px2" event={"ID":"85a15a38-806b-4448-acee-22be2a90388f","Type":"ContainerDied","Data":"0182770a249e71ae68983a8beb058b839e3edb9972bf7c96097b9a39e6587d5f"}
Jan 30 05:25:19 crc kubenswrapper[4841]: I0130 05:25:19.984951 4841 generic.go:334] "Generic (PLEG): container finished" podID="bef5639f-2a56-4255-ae24-a8f794a7b715" containerID="99b725593da174a758fc675f27f806a5cc026e7e2b8a1e1f72cd1b69bbef305c" exitCode=0
Jan 30 05:25:19 crc kubenswrapper[4841]: I0130 05:25:19.985066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mtgqw" event={"ID":"bef5639f-2a56-4255-ae24-a8f794a7b715","Type":"ContainerDied","Data":"99b725593da174a758fc675f27f806a5cc026e7e2b8a1e1f72cd1b69bbef305c"}
Jan 30 05:25:19 crc kubenswrapper[4841]: I0130 05:25:19.989741 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" event={"ID":"8c4ca664-bb24-47c4-a5d8-c242f414b2ac","Type":"ContainerStarted","Data":"1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee"}
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.052782 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" podStartSLOduration=3.052764916 podStartE2EDuration="3.052764916s" podCreationTimestamp="2026-01-30 05:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:20.051362348 +0000 UTC m=+1057.044835036" watchObservedRunningTime="2026-01-30 05:25:20.052764916 +0000 UTC m=+1057.046237564"
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.456495 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583106 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run\") pod \"85a15a38-806b-4448-acee-22be2a90388f\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") "
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583182 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run-ovn\") pod \"85a15a38-806b-4448-acee-22be2a90388f\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") "
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583222 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-additional-scripts\") pod \"85a15a38-806b-4448-acee-22be2a90388f\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") "
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-log-ovn\") pod \"85a15a38-806b-4448-acee-22be2a90388f\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") "
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583307 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-scripts\") pod \"85a15a38-806b-4448-acee-22be2a90388f\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") "
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583378 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vzp6\" (UniqueName: \"kubernetes.io/projected/85a15a38-806b-4448-acee-22be2a90388f-kube-api-access-7vzp6\") pod \"85a15a38-806b-4448-acee-22be2a90388f\" (UID: \"85a15a38-806b-4448-acee-22be2a90388f\") "
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583626 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run" (OuterVolumeSpecName: "var-run") pod "85a15a38-806b-4448-acee-22be2a90388f" (UID: "85a15a38-806b-4448-acee-22be2a90388f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583681 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "85a15a38-806b-4448-acee-22be2a90388f" (UID: "85a15a38-806b-4448-acee-22be2a90388f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.583706 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "85a15a38-806b-4448-acee-22be2a90388f" (UID: "85a15a38-806b-4448-acee-22be2a90388f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.584046 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.584070 4841 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.584090 4841 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85a15a38-806b-4448-acee-22be2a90388f-var-log-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.584191 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "85a15a38-806b-4448-acee-22be2a90388f" (UID: "85a15a38-806b-4448-acee-22be2a90388f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.584545 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-scripts" (OuterVolumeSpecName: "scripts") pod "85a15a38-806b-4448-acee-22be2a90388f" (UID: "85a15a38-806b-4448-acee-22be2a90388f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.594422 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-88rdn-config-v7px2"]
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.604030 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a15a38-806b-4448-acee-22be2a90388f-kube-api-access-7vzp6" (OuterVolumeSpecName: "kube-api-access-7vzp6") pod "85a15a38-806b-4448-acee-22be2a90388f" (UID: "85a15a38-806b-4448-acee-22be2a90388f"). InnerVolumeSpecName "kube-api-access-7vzp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.604197 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-88rdn-config-v7px2"]
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.685037 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.685072 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vzp6\" (UniqueName: \"kubernetes.io/projected/85a15a38-806b-4448-acee-22be2a90388f-kube-api-access-7vzp6\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:20 crc kubenswrapper[4841]: I0130 05:25:20.685087 4841 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85a15a38-806b-4448-acee-22be2a90388f-additional-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.002506 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn-config-v7px2"
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.002617 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8e9e5c7f7568b66a6c638cbf47f8d07c6088312ce5a5ae4fe71e46312270311"
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.003107 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8db84466c-2l7ck"
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.522812 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mtgqw"
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.601025 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-db-sync-config-data\") pod \"bef5639f-2a56-4255-ae24-a8f794a7b715\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") "
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.601224 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c974q\" (UniqueName: \"kubernetes.io/projected/bef5639f-2a56-4255-ae24-a8f794a7b715-kube-api-access-c974q\") pod \"bef5639f-2a56-4255-ae24-a8f794a7b715\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") "
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.601256 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-config-data\") pod \"bef5639f-2a56-4255-ae24-a8f794a7b715\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") "
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.601281 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-combined-ca-bundle\") pod \"bef5639f-2a56-4255-ae24-a8f794a7b715\" (UID: \"bef5639f-2a56-4255-ae24-a8f794a7b715\") "
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.613595 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bef5639f-2a56-4255-ae24-a8f794a7b715" (UID: "bef5639f-2a56-4255-ae24-a8f794a7b715"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.615003 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef5639f-2a56-4255-ae24-a8f794a7b715-kube-api-access-c974q" (OuterVolumeSpecName: "kube-api-access-c974q") pod "bef5639f-2a56-4255-ae24-a8f794a7b715" (UID: "bef5639f-2a56-4255-ae24-a8f794a7b715"). InnerVolumeSpecName "kube-api-access-c974q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.642666 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bef5639f-2a56-4255-ae24-a8f794a7b715" (UID: "bef5639f-2a56-4255-ae24-a8f794a7b715"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.681955 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-config-data" (OuterVolumeSpecName: "config-data") pod "bef5639f-2a56-4255-ae24-a8f794a7b715" (UID: "bef5639f-2a56-4255-ae24-a8f794a7b715"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.703203 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.703260 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c974q\" (UniqueName: \"kubernetes.io/projected/bef5639f-2a56-4255-ae24-a8f794a7b715-kube-api-access-c974q\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.703279 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:21 crc kubenswrapper[4841]: I0130 05:25:21.703295 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef5639f-2a56-4255-ae24-a8f794a7b715-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.018461 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mtgqw" event={"ID":"bef5639f-2a56-4255-ae24-a8f794a7b715","Type":"ContainerDied","Data":"771694525e8d562e923111942b353ecef6077d76a9710f623bed79843db591e1"}
Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.018532 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="771694525e8d562e923111942b353ecef6077d76a9710f623bed79843db591e1"
Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.018554 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mtgqw" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.452707 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a15a38-806b-4448-acee-22be2a90388f" path="/var/lib/kubelet/pods/85a15a38-806b-4448-acee-22be2a90388f/volumes" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.522292 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-2l7ck"] Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.554648 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-fb28v"] Jan 30 05:25:22 crc kubenswrapper[4841]: E0130 05:25:22.554948 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef5639f-2a56-4255-ae24-a8f794a7b715" containerName="glance-db-sync" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.554963 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef5639f-2a56-4255-ae24-a8f794a7b715" containerName="glance-db-sync" Jan 30 05:25:22 crc kubenswrapper[4841]: E0130 05:25:22.554976 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a15a38-806b-4448-acee-22be2a90388f" containerName="ovn-config" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.554982 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a15a38-806b-4448-acee-22be2a90388f" containerName="ovn-config" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.555147 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef5639f-2a56-4255-ae24-a8f794a7b715" containerName="glance-db-sync" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.555155 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a15a38-806b-4448-acee-22be2a90388f" containerName="ovn-config" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.556067 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.615919 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-fb28v"] Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.619662 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.619711 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-config\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.619742 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvg4\" (UniqueName: \"kubernetes.io/projected/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-kube-api-access-nmvg4\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.619769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.619897 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.619975 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.722066 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.722121 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-config\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.722145 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmvg4\" (UniqueName: \"kubernetes.io/projected/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-kube-api-access-nmvg4\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.722982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.723102 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-config\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.723156 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.723224 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.723253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.723576 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.724003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.724037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.741982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvg4\" (UniqueName: \"kubernetes.io/projected/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-kube-api-access-nmvg4\") pod \"dnsmasq-dns-74dfc89d77-fb28v\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:22 crc kubenswrapper[4841]: I0130 05:25:22.887651 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.027640 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" podUID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" containerName="dnsmasq-dns" containerID="cri-o://1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee" gracePeriod=10 Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.361337 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-fb28v"] Jan 30 05:25:23 crc kubenswrapper[4841]: W0130 05:25:23.368063 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod797269f4_a97a_4a49_a5a0_f5a5623f5e0c.slice/crio-e75f7ea47764b2aad1d1e17337fd1d5fbc04c6cf0f011e342a9696358fbfe0c9 WatchSource:0}: Error finding container e75f7ea47764b2aad1d1e17337fd1d5fbc04c6cf0f011e342a9696358fbfe0c9: Status 404 returned error can't find the container with id e75f7ea47764b2aad1d1e17337fd1d5fbc04c6cf0f011e342a9696358fbfe0c9 Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.385340 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.440860 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-swift-storage-0\") pod \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.441107 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-nb\") pod \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.441182 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkks9\" (UniqueName: \"kubernetes.io/projected/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-kube-api-access-vkks9\") pod \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.441330 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-sb\") pod \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.441437 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-svc\") pod \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.441524 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-config\") pod \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\" (UID: \"8c4ca664-bb24-47c4-a5d8-c242f414b2ac\") " Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.450557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-kube-api-access-vkks9" (OuterVolumeSpecName: "kube-api-access-vkks9") pod "8c4ca664-bb24-47c4-a5d8-c242f414b2ac" (UID: "8c4ca664-bb24-47c4-a5d8-c242f414b2ac"). InnerVolumeSpecName "kube-api-access-vkks9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.497908 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-config" (OuterVolumeSpecName: "config") pod "8c4ca664-bb24-47c4-a5d8-c242f414b2ac" (UID: "8c4ca664-bb24-47c4-a5d8-c242f414b2ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.501824 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c4ca664-bb24-47c4-a5d8-c242f414b2ac" (UID: "8c4ca664-bb24-47c4-a5d8-c242f414b2ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.503789 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c4ca664-bb24-47c4-a5d8-c242f414b2ac" (UID: "8c4ca664-bb24-47c4-a5d8-c242f414b2ac"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.510247 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c4ca664-bb24-47c4-a5d8-c242f414b2ac" (UID: "8c4ca664-bb24-47c4-a5d8-c242f414b2ac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.518887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c4ca664-bb24-47c4-a5d8-c242f414b2ac" (UID: "8c4ca664-bb24-47c4-a5d8-c242f414b2ac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.543105 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.543139 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.543148 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.543157 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.543166 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:23 crc kubenswrapper[4841]: I0130 05:25:23.543196 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkks9\" (UniqueName: \"kubernetes.io/projected/8c4ca664-bb24-47c4-a5d8-c242f414b2ac-kube-api-access-vkks9\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.039275 4841 generic.go:334] "Generic (PLEG): container finished" podID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" containerID="1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee" exitCode=0 Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.039720 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.040271 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" event={"ID":"8c4ca664-bb24-47c4-a5d8-c242f414b2ac","Type":"ContainerDied","Data":"1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee"} Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.040315 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-2l7ck" event={"ID":"8c4ca664-bb24-47c4-a5d8-c242f414b2ac","Type":"ContainerDied","Data":"0b664b4f8b9d7528f694c2740cf35dbff8e6ed5294480f79c152be5f0c8740eb"} Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.040332 4841 scope.go:117] "RemoveContainer" containerID="1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.044172 4841 generic.go:334] "Generic (PLEG): container finished" podID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" 
containerID="889730a22c7b6cea75cc2bd04ff5c116d6193d6c36fbcaba34622c33276ee496" exitCode=0 Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.044224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" event={"ID":"797269f4-a97a-4a49-a5a0-f5a5623f5e0c","Type":"ContainerDied","Data":"889730a22c7b6cea75cc2bd04ff5c116d6193d6c36fbcaba34622c33276ee496"} Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.044262 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" event={"ID":"797269f4-a97a-4a49-a5a0-f5a5623f5e0c","Type":"ContainerStarted","Data":"e75f7ea47764b2aad1d1e17337fd1d5fbc04c6cf0f011e342a9696358fbfe0c9"} Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.090168 4841 scope.go:117] "RemoveContainer" containerID="1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.147621 4841 scope.go:117] "RemoveContainer" containerID="1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee" Jan 30 05:25:24 crc kubenswrapper[4841]: E0130 05:25:24.156550 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee\": container with ID starting with 1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee not found: ID does not exist" containerID="1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.156598 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee"} err="failed to get container status \"1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee\": rpc error: code = NotFound desc = could not find container 
\"1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee\": container with ID starting with 1adc169f0900677778ef20043e47b295c8c2a64e3b1adaaf476bae43a86d65ee not found: ID does not exist" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.156625 4841 scope.go:117] "RemoveContainer" containerID="1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398" Jan 30 05:25:24 crc kubenswrapper[4841]: E0130 05:25:24.159763 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398\": container with ID starting with 1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398 not found: ID does not exist" containerID="1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.159791 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398"} err="failed to get container status \"1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398\": rpc error: code = NotFound desc = could not find container \"1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398\": container with ID starting with 1583d7cabd448257fb461f6087b1f4a6d1533fee9944bba136609bbd6e09e398 not found: ID does not exist" Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.162614 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-2l7ck"] Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.170450 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-2l7ck"] Jan 30 05:25:24 crc kubenswrapper[4841]: I0130 05:25:24.443323 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" 
path="/var/lib/kubelet/pods/8c4ca664-bb24-47c4-a5d8-c242f414b2ac/volumes" Jan 30 05:25:25 crc kubenswrapper[4841]: I0130 05:25:25.091792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" event={"ID":"797269f4-a97a-4a49-a5a0-f5a5623f5e0c","Type":"ContainerStarted","Data":"4505dd4fa912678acacc7f9626ff3079bef2687f832e614da822c9cdb2620da4"} Jan 30 05:25:25 crc kubenswrapper[4841]: I0130 05:25:25.092142 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:25:25 crc kubenswrapper[4841]: I0130 05:25:25.125990 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" podStartSLOduration=3.125974191 podStartE2EDuration="3.125974191s" podCreationTimestamp="2026-01-30 05:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:25.125429077 +0000 UTC m=+1062.118901715" watchObservedRunningTime="2026-01-30 05:25:25.125974191 +0000 UTC m=+1062.119446829" Jan 30 05:25:30 crc kubenswrapper[4841]: I0130 05:25:30.627710 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 05:25:30 crc kubenswrapper[4841]: I0130 05:25:30.932104 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.093877 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hpd9d"] Jan 30 05:25:31 crc kubenswrapper[4841]: E0130 05:25:31.094171 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" containerName="init" Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.094183 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" 
containerName="init" Jan 30 05:25:31 crc kubenswrapper[4841]: E0130 05:25:31.094206 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" containerName="dnsmasq-dns" Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.094212 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" containerName="dnsmasq-dns" Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.094373 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c4ca664-bb24-47c4-a5d8-c242f414b2ac" containerName="dnsmasq-dns" Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.094873 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hpd9d" Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.107638 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hpd9d"] Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.172115 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-26gs6"] Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.172971 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26gs6" Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.190708 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-676d-account-create-update-bdk8r"] Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.192180 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.196570 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.199359 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-26gs6"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.218845 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-676d-account-create-update-bdk8r"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.222170 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4fsv\" (UniqueName: \"kubernetes.io/projected/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-kube-api-access-q4fsv\") pod \"barbican-db-create-26gs6\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " pod="openstack/barbican-db-create-26gs6"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.222209 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-operator-scripts\") pod \"barbican-db-create-26gs6\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " pod="openstack/barbican-db-create-26gs6"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.222233 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wvm\" (UniqueName: \"kubernetes.io/projected/8a361348-e06e-4aa4-b180-0450782b1dfc-kube-api-access-w9wvm\") pod \"cinder-db-create-hpd9d\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " pod="openstack/cinder-db-create-hpd9d"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.222255 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a361348-e06e-4aa4-b180-0450782b1dfc-operator-scripts\") pod \"cinder-db-create-hpd9d\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " pod="openstack/cinder-db-create-hpd9d"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.324562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4fsv\" (UniqueName: \"kubernetes.io/projected/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-kube-api-access-q4fsv\") pod \"barbican-db-create-26gs6\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " pod="openstack/barbican-db-create-26gs6"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.324637 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-operator-scripts\") pod \"barbican-db-create-26gs6\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " pod="openstack/barbican-db-create-26gs6"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.324668 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wvm\" (UniqueName: \"kubernetes.io/projected/8a361348-e06e-4aa4-b180-0450782b1dfc-kube-api-access-w9wvm\") pod \"cinder-db-create-hpd9d\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " pod="openstack/cinder-db-create-hpd9d"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.324703 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a361348-e06e-4aa4-b180-0450782b1dfc-operator-scripts\") pod \"cinder-db-create-hpd9d\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " pod="openstack/cinder-db-create-hpd9d"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.324782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d823011-229a-426a-99d7-0af611df4000-operator-scripts\") pod \"barbican-676d-account-create-update-bdk8r\" (UID: \"0d823011-229a-426a-99d7-0af611df4000\") " pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.324867 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjsc\" (UniqueName: \"kubernetes.io/projected/0d823011-229a-426a-99d7-0af611df4000-kube-api-access-4hjsc\") pod \"barbican-676d-account-create-update-bdk8r\" (UID: \"0d823011-229a-426a-99d7-0af611df4000\") " pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.325551 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a361348-e06e-4aa4-b180-0450782b1dfc-operator-scripts\") pod \"cinder-db-create-hpd9d\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " pod="openstack/cinder-db-create-hpd9d"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.325729 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-operator-scripts\") pod \"barbican-db-create-26gs6\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " pod="openstack/barbican-db-create-26gs6"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.343379 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wvm\" (UniqueName: \"kubernetes.io/projected/8a361348-e06e-4aa4-b180-0450782b1dfc-kube-api-access-w9wvm\") pod \"cinder-db-create-hpd9d\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " pod="openstack/cinder-db-create-hpd9d"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.348268 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4fsv\" (UniqueName: \"kubernetes.io/projected/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-kube-api-access-q4fsv\") pod \"barbican-db-create-26gs6\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " pod="openstack/barbican-db-create-26gs6"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.378015 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-de72-account-create-update-8qxjx"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.378899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.388601 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.388829 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-de72-account-create-update-8qxjx"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.424837 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hpd9d"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.426095 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjsc\" (UniqueName: \"kubernetes.io/projected/0d823011-229a-426a-99d7-0af611df4000-kube-api-access-4hjsc\") pod \"barbican-676d-account-create-update-bdk8r\" (UID: \"0d823011-229a-426a-99d7-0af611df4000\") " pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.426184 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb5p2\" (UniqueName: \"kubernetes.io/projected/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-kube-api-access-bb5p2\") pod \"cinder-de72-account-create-update-8qxjx\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.426238 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d823011-229a-426a-99d7-0af611df4000-operator-scripts\") pod \"barbican-676d-account-create-update-bdk8r\" (UID: \"0d823011-229a-426a-99d7-0af611df4000\") " pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.426259 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-operator-scripts\") pod \"cinder-de72-account-create-update-8qxjx\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.427085 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d823011-229a-426a-99d7-0af611df4000-operator-scripts\") pod \"barbican-676d-account-create-update-bdk8r\" (UID: \"0d823011-229a-426a-99d7-0af611df4000\") " pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.429996 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-45dj5"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.430984 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.432555 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpg5t"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.432719 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.432844 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.435580 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.466572 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-45dj5"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.481615 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjsc\" (UniqueName: \"kubernetes.io/projected/0d823011-229a-426a-99d7-0af611df4000-kube-api-access-4hjsc\") pod \"barbican-676d-account-create-update-bdk8r\" (UID: \"0d823011-229a-426a-99d7-0af611df4000\") " pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.491366 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26gs6"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.509145 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1306-account-create-update-mf8cj"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.509384 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-676d-account-create-update-bdk8r"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.510061 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.513466 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.521431 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-t5622"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.522337 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.528489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jpm\" (UniqueName: \"kubernetes.io/projected/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-kube-api-access-m7jpm\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.528555 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-config-data\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.528590 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-combined-ca-bundle\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.528670 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb5p2\" (UniqueName: \"kubernetes.io/projected/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-kube-api-access-bb5p2\") pod \"cinder-de72-account-create-update-8qxjx\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.528736 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-operator-scripts\") pod \"cinder-de72-account-create-update-8qxjx\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.529478 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-operator-scripts\") pod \"cinder-de72-account-create-update-8qxjx\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.547689 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1306-account-create-update-mf8cj"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.553585 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t5622"]
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.567680 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb5p2\" (UniqueName: \"kubernetes.io/projected/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-kube-api-access-bb5p2\") pod \"cinder-de72-account-create-update-8qxjx\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.630267 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m29wp\" (UniqueName: \"kubernetes.io/projected/68f5077d-7967-4cbe-9254-09728b25ab58-kube-api-access-m29wp\") pod \"neutron-db-create-t5622\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.630323 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rthxm\" (UniqueName: \"kubernetes.io/projected/36bee932-2dad-4fca-aff2-0170cb6d4af8-kube-api-access-rthxm\") pod \"neutron-1306-account-create-update-mf8cj\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.630348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68f5077d-7967-4cbe-9254-09728b25ab58-operator-scripts\") pod \"neutron-db-create-t5622\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.630370 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jpm\" (UniqueName: \"kubernetes.io/projected/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-kube-api-access-m7jpm\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.630407 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36bee932-2dad-4fca-aff2-0170cb6d4af8-operator-scripts\") pod \"neutron-1306-account-create-update-mf8cj\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.630480 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-config-data\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.630605 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-combined-ca-bundle\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.633905 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-combined-ca-bundle\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.634471 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-config-data\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.659144 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jpm\" (UniqueName: \"kubernetes.io/projected/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-kube-api-access-m7jpm\") pod \"keystone-db-sync-45dj5\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.733279 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m29wp\" (UniqueName: \"kubernetes.io/projected/68f5077d-7967-4cbe-9254-09728b25ab58-kube-api-access-m29wp\") pod \"neutron-db-create-t5622\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.733334 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rthxm\" (UniqueName: \"kubernetes.io/projected/36bee932-2dad-4fca-aff2-0170cb6d4af8-kube-api-access-rthxm\") pod \"neutron-1306-account-create-update-mf8cj\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.733360 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68f5077d-7967-4cbe-9254-09728b25ab58-operator-scripts\") pod \"neutron-db-create-t5622\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.733384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36bee932-2dad-4fca-aff2-0170cb6d4af8-operator-scripts\") pod \"neutron-1306-account-create-update-mf8cj\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.734186 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36bee932-2dad-4fca-aff2-0170cb6d4af8-operator-scripts\") pod \"neutron-1306-account-create-update-mf8cj\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.734410 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68f5077d-7967-4cbe-9254-09728b25ab58-operator-scripts\") pod \"neutron-db-create-t5622\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.744840 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-de72-account-create-update-8qxjx"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.766052 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rthxm\" (UniqueName: \"kubernetes.io/projected/36bee932-2dad-4fca-aff2-0170cb6d4af8-kube-api-access-rthxm\") pod \"neutron-1306-account-create-update-mf8cj\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.776627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m29wp\" (UniqueName: \"kubernetes.io/projected/68f5077d-7967-4cbe-9254-09728b25ab58-kube-api-access-m29wp\") pod \"neutron-db-create-t5622\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.847449 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.859511 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1306-account-create-update-mf8cj"
Jan 30 05:25:31 crc kubenswrapper[4841]: I0130 05:25:31.884505 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t5622"
Jan 30 05:25:32 crc kubenswrapper[4841]: I0130 05:25:32.312960 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hpd9d"]
Jan 30 05:25:33 crc kubenswrapper[4841]: W0130 05:25:32.547520 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d823011_229a_426a_99d7_0af611df4000.slice/crio-52b11a0f25c4b378e65c7784349203ac082a9e72c08e54e49efa6affd956004a WatchSource:0}: Error finding container 52b11a0f25c4b378e65c7784349203ac082a9e72c08e54e49efa6affd956004a: Status 404 returned error can't find the container with id 52b11a0f25c4b378e65c7784349203ac082a9e72c08e54e49efa6affd956004a
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:32.552784 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-26gs6"]
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:32.559196 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-676d-account-create-update-bdk8r"]
Jan 30 05:25:33 crc kubenswrapper[4841]: W0130 05:25:32.585537 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1611ffb0_e1aa_487c_aabb_a0f71f4856ff.slice/crio-02087a8f3d8412e3393f9730303503cfb661f38edaa27fe60b55518197490f7b WatchSource:0}: Error finding container 02087a8f3d8412e3393f9730303503cfb661f38edaa27fe60b55518197490f7b: Status 404 returned error can't find the container with id 02087a8f3d8412e3393f9730303503cfb661f38edaa27fe60b55518197490f7b
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:32.888557 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v"
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:32.961050 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-6vbsx"]
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:32.961271 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" podUID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerName="dnsmasq-dns" containerID="cri-o://6e40304e7bb30193215e7c6aa1b64cb5e94495d6e9c2a8ef9b6c2738d50d1707" gracePeriod=10
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:33.167717 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26gs6" event={"ID":"1611ffb0-e1aa-487c-aabb-a0f71f4856ff","Type":"ContainerStarted","Data":"02087a8f3d8412e3393f9730303503cfb661f38edaa27fe60b55518197490f7b"}
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:33.169189 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hpd9d" event={"ID":"8a361348-e06e-4aa4-b180-0450782b1dfc","Type":"ContainerStarted","Data":"15a9ca558c0006d54dde07e97eeaf1f1b91a5e960ef60682e27a82a57735ae35"}
Jan 30 05:25:33 crc kubenswrapper[4841]: I0130 05:25:33.170295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-676d-account-create-update-bdk8r" event={"ID":"0d823011-229a-426a-99d7-0af611df4000","Type":"ContainerStarted","Data":"52b11a0f25c4b378e65c7784349203ac082a9e72c08e54e49efa6affd956004a"}
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.181884 4841 generic.go:334] "Generic (PLEG): container finished" podID="1611ffb0-e1aa-487c-aabb-a0f71f4856ff" containerID="c9437d4bfc1aa568db9f236839b9880300b298866644c5b7a56808c9053266e9" exitCode=0
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.182010 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26gs6" event={"ID":"1611ffb0-e1aa-487c-aabb-a0f71f4856ff","Type":"ContainerDied","Data":"c9437d4bfc1aa568db9f236839b9880300b298866644c5b7a56808c9053266e9"}
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.191287 4841 generic.go:334] "Generic (PLEG): container finished" podID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerID="6e40304e7bb30193215e7c6aa1b64cb5e94495d6e9c2a8ef9b6c2738d50d1707" exitCode=0
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.191509 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" event={"ID":"1b9dbcfe-b92e-4c53-a410-86416ae62413","Type":"ContainerDied","Data":"6e40304e7bb30193215e7c6aa1b64cb5e94495d6e9c2a8ef9b6c2738d50d1707"}
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.199027 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a361348-e06e-4aa4-b180-0450782b1dfc" containerID="0a70a0f6a798c8a514a3ce1eb1cd476bac1ed23f3497cd8cc84628ea9151c1f3" exitCode=0
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.199090 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hpd9d" event={"ID":"8a361348-e06e-4aa4-b180-0450782b1dfc","Type":"ContainerDied","Data":"0a70a0f6a798c8a514a3ce1eb1cd476bac1ed23f3497cd8cc84628ea9151c1f3"}
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.200974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-676d-account-create-update-bdk8r" event={"ID":"0d823011-229a-426a-99d7-0af611df4000","Type":"ContainerStarted","Data":"9c08f840a59acac8f70270c43eea5ff5025ee709df5670b4390941105ac52fdb"}
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.270243 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-676d-account-create-update-bdk8r" podStartSLOduration=3.270222201 podStartE2EDuration="3.270222201s" podCreationTimestamp="2026-01-30 05:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:34.257693212 +0000 UTC m=+1071.251165850" watchObservedRunningTime="2026-01-30 05:25:34.270222201 +0000 UTC m=+1071.263694839"
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.352793 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx"
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.403925 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-nb\") pod \"1b9dbcfe-b92e-4c53-a410-86416ae62413\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") "
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.403985 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-sb\") pod \"1b9dbcfe-b92e-4c53-a410-86416ae62413\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") "
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.404039 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-dns-svc\") pod \"1b9dbcfe-b92e-4c53-a410-86416ae62413\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") "
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.404140 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhghd\" (UniqueName: \"kubernetes.io/projected/1b9dbcfe-b92e-4c53-a410-86416ae62413-kube-api-access-jhghd\") pod \"1b9dbcfe-b92e-4c53-a410-86416ae62413\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") "
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.404223 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-config\") pod \"1b9dbcfe-b92e-4c53-a410-86416ae62413\" (UID: \"1b9dbcfe-b92e-4c53-a410-86416ae62413\") "
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.420759 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9dbcfe-b92e-4c53-a410-86416ae62413-kube-api-access-jhghd" (OuterVolumeSpecName: "kube-api-access-jhghd") pod "1b9dbcfe-b92e-4c53-a410-86416ae62413" (UID: "1b9dbcfe-b92e-4c53-a410-86416ae62413"). InnerVolumeSpecName "kube-api-access-jhghd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.428523 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-de72-account-create-update-8qxjx"]
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.506788 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhghd\" (UniqueName: \"kubernetes.io/projected/1b9dbcfe-b92e-4c53-a410-86416ae62413-kube-api-access-jhghd\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.549270 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b9dbcfe-b92e-4c53-a410-86416ae62413" (UID: "1b9dbcfe-b92e-4c53-a410-86416ae62413"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.575284 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b9dbcfe-b92e-4c53-a410-86416ae62413" (UID: "1b9dbcfe-b92e-4c53-a410-86416ae62413"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.587167 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-config" (OuterVolumeSpecName: "config") pod "1b9dbcfe-b92e-4c53-a410-86416ae62413" (UID: "1b9dbcfe-b92e-4c53-a410-86416ae62413"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.613723 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.613749 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.613760 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.627015 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b9dbcfe-b92e-4c53-a410-86416ae62413" (UID: "1b9dbcfe-b92e-4c53-a410-86416ae62413"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:25:34 crc kubenswrapper[4841]: W0130 05:25:34.660435 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36bee932_2dad_4fca_aff2_0170cb6d4af8.slice/crio-962933372aea1cb7a25f6b30115b1779431d4235d101596c2b616772a7f025d8 WatchSource:0}: Error finding container 962933372aea1cb7a25f6b30115b1779431d4235d101596c2b616772a7f025d8: Status 404 returned error can't find the container with id 962933372aea1cb7a25f6b30115b1779431d4235d101596c2b616772a7f025d8
Jan 30 05:25:34 crc kubenswrapper[4841]: W0130 05:25:34.664448 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ed3d0b9_0cdf_4174_8283_5dd9faadbe34.slice/crio-167ca370d7c0a0c12c9b9458b6de506ddd6d46228d5fbbce2e430dcd154b7134 WatchSource:0}: Error finding container 167ca370d7c0a0c12c9b9458b6de506ddd6d46228d5fbbce2e430dcd154b7134: Status 404 returned error can't find the container with id 167ca370d7c0a0c12c9b9458b6de506ddd6d46228d5fbbce2e430dcd154b7134
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.664643 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t5622"]
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.664682 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-45dj5"]
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.664694 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1306-account-create-update-mf8cj"]
Jan 30 05:25:34 crc kubenswrapper[4841]: I0130 05:25:34.718015 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b9dbcfe-b92e-4c53-a410-86416ae62413-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.210893 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1306-account-create-update-mf8cj" event={"ID":"36bee932-2dad-4fca-aff2-0170cb6d4af8","Type":"ContainerStarted","Data":"ccf928f9462b3ee5fb5e26c65672c931a7fab29d56b8e9e63d2a1c130071a792"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.210947 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1306-account-create-update-mf8cj" event={"ID":"36bee932-2dad-4fca-aff2-0170cb6d4af8","Type":"ContainerStarted","Data":"962933372aea1cb7a25f6b30115b1779431d4235d101596c2b616772a7f025d8"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.212825 4841 generic.go:334] "Generic (PLEG): container finished" podID="68f5077d-7967-4cbe-9254-09728b25ab58" containerID="7c2452ee2b7d6245999af2b7e81b2287c244e2079fa771faf3530fcfb5a47e79" exitCode=0
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.212900 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t5622" event={"ID":"68f5077d-7967-4cbe-9254-09728b25ab58","Type":"ContainerDied","Data":"7c2452ee2b7d6245999af2b7e81b2287c244e2079fa771faf3530fcfb5a47e79"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.212926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t5622" event={"ID":"68f5077d-7967-4cbe-9254-09728b25ab58","Type":"ContainerStarted","Data":"067919409e1ad24c09b8dd158e4541ee76851a7b2ea3a16ccfccf0e94306e460"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.216656 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45dj5" event={"ID":"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34","Type":"ContainerStarted","Data":"167ca370d7c0a0c12c9b9458b6de506ddd6d46228d5fbbce2e430dcd154b7134"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.219591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx" event={"ID":"1b9dbcfe-b92e-4c53-a410-86416ae62413","Type":"ContainerDied","Data":"db9ec6f522ae76077ce78ba3dec5cb60f5313fb37859a09b8eb3e085871db08d"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.219622 4841 scope.go:117] "RemoveContainer" containerID="6e40304e7bb30193215e7c6aa1b64cb5e94495d6e9c2a8ef9b6c2738d50d1707"
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.219687 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-6vbsx"
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.221068 4841 generic.go:334] "Generic (PLEG): container finished" podID="0d823011-229a-426a-99d7-0af611df4000" containerID="9c08f840a59acac8f70270c43eea5ff5025ee709df5670b4390941105ac52fdb" exitCode=0
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.221101 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-676d-account-create-update-bdk8r" event={"ID":"0d823011-229a-426a-99d7-0af611df4000","Type":"ContainerDied","Data":"9c08f840a59acac8f70270c43eea5ff5025ee709df5670b4390941105ac52fdb"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.222241 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de72-account-create-update-8qxjx" event={"ID":"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1","Type":"ContainerStarted","Data":"2733c25272bc77b1dfc05208dea14e07e8ff595e8adce5963f17598a5feb122b"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.222288 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de72-account-create-update-8qxjx" event={"ID":"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1","Type":"ContainerStarted","Data":"d12145dec8b0bde50fa394da9df5092ce7df561d662c1ce61493fca1960c8a0f"}
Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.231251 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1306-account-create-update-mf8cj" podStartSLOduration=4.23123256
podStartE2EDuration="4.23123256s" podCreationTimestamp="2026-01-30 05:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:35.227464098 +0000 UTC m=+1072.220936736" watchObservedRunningTime="2026-01-30 05:25:35.23123256 +0000 UTC m=+1072.224705198" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.290961 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-de72-account-create-update-8qxjx" podStartSLOduration=4.290935274 podStartE2EDuration="4.290935274s" podCreationTimestamp="2026-01-30 05:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:35.277301046 +0000 UTC m=+1072.270773684" watchObservedRunningTime="2026-01-30 05:25:35.290935274 +0000 UTC m=+1072.284407912" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.400069 4841 scope.go:117] "RemoveContainer" containerID="042b8d579a22310a2de0e98f37bdc1bd2e5e2047dcadb4df5582af91e9a92bae" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.401975 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-6vbsx"] Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.413684 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-6vbsx"] Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.745813 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hpd9d" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.753672 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-26gs6" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.839149 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4fsv\" (UniqueName: \"kubernetes.io/projected/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-kube-api-access-q4fsv\") pod \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.839238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-operator-scripts\") pod \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\" (UID: \"1611ffb0-e1aa-487c-aabb-a0f71f4856ff\") " Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.839715 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1611ffb0-e1aa-487c-aabb-a0f71f4856ff" (UID: "1611ffb0-e1aa-487c-aabb-a0f71f4856ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.839831 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a361348-e06e-4aa4-b180-0450782b1dfc-operator-scripts\") pod \"8a361348-e06e-4aa4-b180-0450782b1dfc\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.839862 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9wvm\" (UniqueName: \"kubernetes.io/projected/8a361348-e06e-4aa4-b180-0450782b1dfc-kube-api-access-w9wvm\") pod \"8a361348-e06e-4aa4-b180-0450782b1dfc\" (UID: \"8a361348-e06e-4aa4-b180-0450782b1dfc\") " Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.840385 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.840550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a361348-e06e-4aa4-b180-0450782b1dfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a361348-e06e-4aa4-b180-0450782b1dfc" (UID: "8a361348-e06e-4aa4-b180-0450782b1dfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.844593 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a361348-e06e-4aa4-b180-0450782b1dfc-kube-api-access-w9wvm" (OuterVolumeSpecName: "kube-api-access-w9wvm") pod "8a361348-e06e-4aa4-b180-0450782b1dfc" (UID: "8a361348-e06e-4aa4-b180-0450782b1dfc"). InnerVolumeSpecName "kube-api-access-w9wvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.847986 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-kube-api-access-q4fsv" (OuterVolumeSpecName: "kube-api-access-q4fsv") pod "1611ffb0-e1aa-487c-aabb-a0f71f4856ff" (UID: "1611ffb0-e1aa-487c-aabb-a0f71f4856ff"). InnerVolumeSpecName "kube-api-access-q4fsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.941924 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a361348-e06e-4aa4-b180-0450782b1dfc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.941954 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9wvm\" (UniqueName: \"kubernetes.io/projected/8a361348-e06e-4aa4-b180-0450782b1dfc-kube-api-access-w9wvm\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:35 crc kubenswrapper[4841]: I0130 05:25:35.941964 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4fsv\" (UniqueName: \"kubernetes.io/projected/1611ffb0-e1aa-487c-aabb-a0f71f4856ff-kube-api-access-q4fsv\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.231684 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26gs6" event={"ID":"1611ffb0-e1aa-487c-aabb-a0f71f4856ff","Type":"ContainerDied","Data":"02087a8f3d8412e3393f9730303503cfb661f38edaa27fe60b55518197490f7b"} Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.231968 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02087a8f3d8412e3393f9730303503cfb661f38edaa27fe60b55518197490f7b" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.231904 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-create-26gs6" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.238583 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hpd9d" event={"ID":"8a361348-e06e-4aa4-b180-0450782b1dfc","Type":"ContainerDied","Data":"15a9ca558c0006d54dde07e97eeaf1f1b91a5e960ef60682e27a82a57735ae35"} Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.238615 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hpd9d" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.238633 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15a9ca558c0006d54dde07e97eeaf1f1b91a5e960ef60682e27a82a57735ae35" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.240848 4841 generic.go:334] "Generic (PLEG): container finished" podID="39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1" containerID="2733c25272bc77b1dfc05208dea14e07e8ff595e8adce5963f17598a5feb122b" exitCode=0 Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.240946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de72-account-create-update-8qxjx" event={"ID":"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1","Type":"ContainerDied","Data":"2733c25272bc77b1dfc05208dea14e07e8ff595e8adce5963f17598a5feb122b"} Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.242507 4841 generic.go:334] "Generic (PLEG): container finished" podID="36bee932-2dad-4fca-aff2-0170cb6d4af8" containerID="ccf928f9462b3ee5fb5e26c65672c931a7fab29d56b8e9e63d2a1c130071a792" exitCode=0 Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.242656 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1306-account-create-update-mf8cj" event={"ID":"36bee932-2dad-4fca-aff2-0170cb6d4af8","Type":"ContainerDied","Data":"ccf928f9462b3ee5fb5e26c65672c931a7fab29d56b8e9e63d2a1c130071a792"} Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 
05:25:36.441766 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9dbcfe-b92e-4c53-a410-86416ae62413" path="/var/lib/kubelet/pods/1b9dbcfe-b92e-4c53-a410-86416ae62413/volumes" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.650730 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t5622" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.654392 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-676d-account-create-update-bdk8r" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.768482 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68f5077d-7967-4cbe-9254-09728b25ab58-operator-scripts\") pod \"68f5077d-7967-4cbe-9254-09728b25ab58\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.768645 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m29wp\" (UniqueName: \"kubernetes.io/projected/68f5077d-7967-4cbe-9254-09728b25ab58-kube-api-access-m29wp\") pod \"68f5077d-7967-4cbe-9254-09728b25ab58\" (UID: \"68f5077d-7967-4cbe-9254-09728b25ab58\") " Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.768712 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hjsc\" (UniqueName: \"kubernetes.io/projected/0d823011-229a-426a-99d7-0af611df4000-kube-api-access-4hjsc\") pod \"0d823011-229a-426a-99d7-0af611df4000\" (UID: \"0d823011-229a-426a-99d7-0af611df4000\") " Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.768732 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d823011-229a-426a-99d7-0af611df4000-operator-scripts\") pod \"0d823011-229a-426a-99d7-0af611df4000\" (UID: 
\"0d823011-229a-426a-99d7-0af611df4000\") " Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.769242 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f5077d-7967-4cbe-9254-09728b25ab58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68f5077d-7967-4cbe-9254-09728b25ab58" (UID: "68f5077d-7967-4cbe-9254-09728b25ab58"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.769407 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d823011-229a-426a-99d7-0af611df4000-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d823011-229a-426a-99d7-0af611df4000" (UID: "0d823011-229a-426a-99d7-0af611df4000"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.777555 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d823011-229a-426a-99d7-0af611df4000-kube-api-access-4hjsc" (OuterVolumeSpecName: "kube-api-access-4hjsc") pod "0d823011-229a-426a-99d7-0af611df4000" (UID: "0d823011-229a-426a-99d7-0af611df4000"). InnerVolumeSpecName "kube-api-access-4hjsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.786199 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f5077d-7967-4cbe-9254-09728b25ab58-kube-api-access-m29wp" (OuterVolumeSpecName: "kube-api-access-m29wp") pod "68f5077d-7967-4cbe-9254-09728b25ab58" (UID: "68f5077d-7967-4cbe-9254-09728b25ab58"). InnerVolumeSpecName "kube-api-access-m29wp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.869947 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m29wp\" (UniqueName: \"kubernetes.io/projected/68f5077d-7967-4cbe-9254-09728b25ab58-kube-api-access-m29wp\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.869979 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d823011-229a-426a-99d7-0af611df4000-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.869999 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hjsc\" (UniqueName: \"kubernetes.io/projected/0d823011-229a-426a-99d7-0af611df4000-kube-api-access-4hjsc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:36 crc kubenswrapper[4841]: I0130 05:25:36.870008 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68f5077d-7967-4cbe-9254-09728b25ab58-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:37 crc kubenswrapper[4841]: I0130 05:25:37.258974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-676d-account-create-update-bdk8r" event={"ID":"0d823011-229a-426a-99d7-0af611df4000","Type":"ContainerDied","Data":"52b11a0f25c4b378e65c7784349203ac082a9e72c08e54e49efa6affd956004a"} Jan 30 05:25:37 crc kubenswrapper[4841]: I0130 05:25:37.259014 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b11a0f25c4b378e65c7784349203ac082a9e72c08e54e49efa6affd956004a" Jan 30 05:25:37 crc kubenswrapper[4841]: I0130 05:25:37.259065 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-676d-account-create-update-bdk8r" Jan 30 05:25:37 crc kubenswrapper[4841]: I0130 05:25:37.269612 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t5622" Jan 30 05:25:37 crc kubenswrapper[4841]: I0130 05:25:37.269969 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t5622" event={"ID":"68f5077d-7967-4cbe-9254-09728b25ab58","Type":"ContainerDied","Data":"067919409e1ad24c09b8dd158e4541ee76851a7b2ea3a16ccfccf0e94306e460"} Jan 30 05:25:37 crc kubenswrapper[4841]: I0130 05:25:37.270014 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067919409e1ad24c09b8dd158e4541ee76851a7b2ea3a16ccfccf0e94306e460" Jan 30 05:25:40 crc kubenswrapper[4841]: I0130 05:25:40.463645 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:25:40 crc kubenswrapper[4841]: I0130 05:25:40.464285 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.865005 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-de72-account-create-update-8qxjx" Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.871167 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1306-account-create-update-mf8cj" Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.933960 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36bee932-2dad-4fca-aff2-0170cb6d4af8-operator-scripts\") pod \"36bee932-2dad-4fca-aff2-0170cb6d4af8\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.934066 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rthxm\" (UniqueName: \"kubernetes.io/projected/36bee932-2dad-4fca-aff2-0170cb6d4af8-kube-api-access-rthxm\") pod \"36bee932-2dad-4fca-aff2-0170cb6d4af8\" (UID: \"36bee932-2dad-4fca-aff2-0170cb6d4af8\") " Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.934108 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-operator-scripts\") pod \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.934123 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb5p2\" (UniqueName: \"kubernetes.io/projected/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-kube-api-access-bb5p2\") pod \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\" (UID: \"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1\") " Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.936449 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bee932-2dad-4fca-aff2-0170cb6d4af8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36bee932-2dad-4fca-aff2-0170cb6d4af8" (UID: "36bee932-2dad-4fca-aff2-0170cb6d4af8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.938190 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bee932-2dad-4fca-aff2-0170cb6d4af8-kube-api-access-rthxm" (OuterVolumeSpecName: "kube-api-access-rthxm") pod "36bee932-2dad-4fca-aff2-0170cb6d4af8" (UID: "36bee932-2dad-4fca-aff2-0170cb6d4af8"). InnerVolumeSpecName "kube-api-access-rthxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.938531 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-kube-api-access-bb5p2" (OuterVolumeSpecName: "kube-api-access-bb5p2") pod "39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1" (UID: "39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1"). InnerVolumeSpecName "kube-api-access-bb5p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:41 crc kubenswrapper[4841]: I0130 05:25:41.939620 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1" (UID: "39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.035961 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rthxm\" (UniqueName: \"kubernetes.io/projected/36bee932-2dad-4fca-aff2-0170cb6d4af8-kube-api-access-rthxm\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.036189 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.036199 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb5p2\" (UniqueName: \"kubernetes.io/projected/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1-kube-api-access-bb5p2\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.036208 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36bee932-2dad-4fca-aff2-0170cb6d4af8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.308621 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1306-account-create-update-mf8cj" event={"ID":"36bee932-2dad-4fca-aff2-0170cb6d4af8","Type":"ContainerDied","Data":"962933372aea1cb7a25f6b30115b1779431d4235d101596c2b616772a7f025d8"} Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.308661 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962933372aea1cb7a25f6b30115b1779431d4235d101596c2b616772a7f025d8" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.308724 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1306-account-create-update-mf8cj" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.316952 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45dj5" event={"ID":"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34","Type":"ContainerStarted","Data":"2fb5720129240f51365edf86e5b5e4984e61c72f72916a58ebebcb8a9d93bad3"} Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.319392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de72-account-create-update-8qxjx" event={"ID":"39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1","Type":"ContainerDied","Data":"d12145dec8b0bde50fa394da9df5092ce7df561d662c1ce61493fca1960c8a0f"} Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.319438 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d12145dec8b0bde50fa394da9df5092ce7df561d662c1ce61493fca1960c8a0f" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.319484 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-de72-account-create-update-8qxjx" Jan 30 05:25:42 crc kubenswrapper[4841]: I0130 05:25:42.906017 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-45dj5" podStartSLOduration=4.883992036 podStartE2EDuration="11.905992914s" podCreationTimestamp="2026-01-30 05:25:31 +0000 UTC" firstStartedPulling="2026-01-30 05:25:34.676900765 +0000 UTC m=+1071.670373393" lastFinishedPulling="2026-01-30 05:25:41.698901593 +0000 UTC m=+1078.692374271" observedRunningTime="2026-01-30 05:25:42.338671888 +0000 UTC m=+1079.332144526" watchObservedRunningTime="2026-01-30 05:25:42.905992914 +0000 UTC m=+1079.899465592" Jan 30 05:25:45 crc kubenswrapper[4841]: I0130 05:25:45.348366 4841 generic.go:334] "Generic (PLEG): container finished" podID="4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" containerID="2fb5720129240f51365edf86e5b5e4984e61c72f72916a58ebebcb8a9d93bad3" exitCode=0 Jan 30 05:25:45 crc kubenswrapper[4841]: I0130 05:25:45.348478 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45dj5" event={"ID":"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34","Type":"ContainerDied","Data":"2fb5720129240f51365edf86e5b5e4984e61c72f72916a58ebebcb8a9d93bad3"} Jan 30 05:25:46 crc kubenswrapper[4841]: I0130 05:25:46.789023 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-45dj5" Jan 30 05:25:46 crc kubenswrapper[4841]: I0130 05:25:46.930621 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-config-data\") pod \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " Jan 30 05:25:46 crc kubenswrapper[4841]: I0130 05:25:46.930794 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-combined-ca-bundle\") pod \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " Jan 30 05:25:46 crc kubenswrapper[4841]: I0130 05:25:46.930828 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jpm\" (UniqueName: \"kubernetes.io/projected/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-kube-api-access-m7jpm\") pod \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\" (UID: \"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34\") " Jan 30 05:25:46 crc kubenswrapper[4841]: I0130 05:25:46.937563 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-kube-api-access-m7jpm" (OuterVolumeSpecName: "kube-api-access-m7jpm") pod "4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" (UID: "4ed3d0b9-0cdf-4174-8283-5dd9faadbe34"). InnerVolumeSpecName "kube-api-access-m7jpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:46 crc kubenswrapper[4841]: I0130 05:25:46.959500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" (UID: "4ed3d0b9-0cdf-4174-8283-5dd9faadbe34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.002607 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-config-data" (OuterVolumeSpecName: "config-data") pod "4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" (UID: "4ed3d0b9-0cdf-4174-8283-5dd9faadbe34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.033555 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.033601 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jpm\" (UniqueName: \"kubernetes.io/projected/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-kube-api-access-m7jpm\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.033622 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.384223 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45dj5" event={"ID":"4ed3d0b9-0cdf-4174-8283-5dd9faadbe34","Type":"ContainerDied","Data":"167ca370d7c0a0c12c9b9458b6de506ddd6d46228d5fbbce2e430dcd154b7134"}
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.384285 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167ca370d7c0a0c12c9b9458b6de506ddd6d46228d5fbbce2e430dcd154b7134"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.384335 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45dj5"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628209 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-v4csl"]
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628625 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f5077d-7967-4cbe-9254-09728b25ab58" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628639 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f5077d-7967-4cbe-9254-09728b25ab58" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628662 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerName="init"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628670 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerName="init"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628685 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" containerName="keystone-db-sync"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628694 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" containerName="keystone-db-sync"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628709 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1611ffb0-e1aa-487c-aabb-a0f71f4856ff" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628718 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1611ffb0-e1aa-487c-aabb-a0f71f4856ff" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628737 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a361348-e06e-4aa4-b180-0450782b1dfc" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628746 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a361348-e06e-4aa4-b180-0450782b1dfc" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628756 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerName="dnsmasq-dns"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628764 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerName="dnsmasq-dns"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628777 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bee932-2dad-4fca-aff2-0170cb6d4af8" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628785 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bee932-2dad-4fca-aff2-0170cb6d4af8" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628803 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628812 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: E0130 05:25:47.628825 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d823011-229a-426a-99d7-0af611df4000" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.628834 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d823011-229a-426a-99d7-0af611df4000" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629018 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="36bee932-2dad-4fca-aff2-0170cb6d4af8" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629035 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f5077d-7967-4cbe-9254-09728b25ab58" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629047 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629062 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9dbcfe-b92e-4c53-a410-86416ae62413" containerName="dnsmasq-dns"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629077 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" containerName="keystone-db-sync"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629089 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a361348-e06e-4aa4-b180-0450782b1dfc" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629104 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1611ffb0-e1aa-487c-aabb-a0f71f4856ff" containerName="mariadb-database-create"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.629118 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d823011-229a-426a-99d7-0af611df4000" containerName="mariadb-account-create-update"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.630122 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.648111 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-v4csl"]
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.667848 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wwr2z"]
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.669015 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.674847 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.675114 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.675220 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.675330 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpg5t"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.675872 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.702485 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wwr2z"]
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.747198 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.747243 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-config\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.747265 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbcx\" (UniqueName: \"kubernetes.io/projected/97750314-b5f7-41a9-92cb-04567a919306-kube-api-access-7mbcx\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.747326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.747349 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.747379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850070 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850115 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-config\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbcx\" (UniqueName: \"kubernetes.io/projected/97750314-b5f7-41a9-92cb-04567a919306-kube-api-access-7mbcx\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850183 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-combined-ca-bundle\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850207 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-credential-keys\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850265 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5wm8\" (UniqueName: \"kubernetes.io/projected/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-kube-api-access-s5wm8\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850299 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-scripts\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850325 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-fernet-keys\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.850358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-config-data\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.851240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.851754 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-config\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.852449 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.854038 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.854696 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.882261 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-t7thb"]
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.883222 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.899207 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s5wxv"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.899459 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.899632 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.901428 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbcx\" (UniqueName: \"kubernetes.io/projected/97750314-b5f7-41a9-92cb-04567a919306-kube-api-access-7mbcx\") pod \"dnsmasq-dns-5fdbfbc95f-v4csl\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.911526 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7thb"]
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.958515 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.959847 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-combined-ca-bundle\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.959896 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5wd\" (UniqueName: \"kubernetes.io/projected/502d6fe3-4215-4b32-8546-a55e5a4afc91-kube-api-access-pj5wd\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.959922 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-credential-keys\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.959983 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5wm8\" (UniqueName: \"kubernetes.io/projected/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-kube-api-access-s5wm8\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960013 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-config-data\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960031 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-scripts\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960076 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-db-sync-config-data\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-fernet-keys\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960120 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/502d6fe3-4215-4b32-8546-a55e5a4afc91-etc-machine-id\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960135 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-scripts\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960170 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-combined-ca-bundle\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.960215 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-config-data\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.964501 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-config-data\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.974280 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-credential-keys\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.978876 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-scripts\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.979427 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-combined-ca-bundle\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:47 crc kubenswrapper[4841]: I0130 05:25:47.991944 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-fernet-keys\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.001804 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5wm8\" (UniqueName: \"kubernetes.io/projected/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-kube-api-access-s5wm8\") pod \"keystone-bootstrap-wwr2z\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " pod="openstack/keystone-bootstrap-wwr2z"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.052425 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dzfl5"]
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.053471 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.056568 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.056710 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t24qx"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.061875 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5wd\" (UniqueName: \"kubernetes.io/projected/502d6fe3-4215-4b32-8546-a55e5a4afc91-kube-api-access-pj5wd\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.061964 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-config-data\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.062002 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-db-sync-config-data\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.062027 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/502d6fe3-4215-4b32-8546-a55e5a4afc91-etc-machine-id\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.062046 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-scripts\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.062073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-combined-ca-bundle\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.062099 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-combined-ca-bundle\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.062123 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-db-sync-config-data\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.062169 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jj4q\" (UniqueName: \"kubernetes.io/projected/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-kube-api-access-2jj4q\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.066569 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/502d6fe3-4215-4b32-8546-a55e5a4afc91-etc-machine-id\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.081140 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-b8f64"]
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.082183 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b8f64"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.083983 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.106594 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-config-data\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.107017 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.107179 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8vxjs"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.109958 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-db-sync-config-data\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.110014 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dzfl5"]
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.128576 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-combined-ca-bundle\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.136957 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b8f64"]
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.142539 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5wd\" (UniqueName: \"kubernetes.io/projected/502d6fe3-4215-4b32-8546-a55e5a4afc91-kube-api-access-pj5wd\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.161211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-scripts\") pod \"cinder-db-sync-t7thb\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.163346 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-combined-ca-bundle\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.163389 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-db-sync-config-data\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.163448 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jj4q\" (UniqueName: \"kubernetes.io/projected/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-kube-api-access-2jj4q\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.170052 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-combined-ca-bundle\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.181024 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-db-sync-config-data\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.250896 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jj4q\" (UniqueName: \"kubernetes.io/projected/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-kube-api-access-2jj4q\") pod \"barbican-db-sync-dzfl5\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " pod="openstack/barbican-db-sync-dzfl5"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.250956 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.257775 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.271926 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.273318 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7thb"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.273370 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.273970 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-v4csl"]
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.283025 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-config\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.283593 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-run-httpd\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.284098 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.284235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv86p\" (UniqueName: \"kubernetes.io/projected/a606323e-b83b-4046-aeff-ea4ded617943-kube-api-access-nv86p\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.284327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qltc\" (UniqueName: \"kubernetes.io/projected/95528d5c-f34b-4913-9db3-05ef436c106d-kube-api-access-8qltc\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.284447 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.284525 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-combined-ca-bundle\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.284610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-config-data\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0"
Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.287070 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wwr2z" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.290698 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.296561 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-log-httpd\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.296752 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-scripts\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.325328 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8z7z8"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.326316 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.328708 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.328926 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4vwkx" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.329030 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.338139 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8z7z8"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.346642 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-mtzkb"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.351499 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.358746 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-mtzkb"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.399792 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qltc\" (UniqueName: \"kubernetes.io/projected/95528d5c-f34b-4913-9db3-05ef436c106d-kube-api-access-8qltc\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400004 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-config\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400045 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnjht\" (UniqueName: \"kubernetes.io/projected/04f2acf0-5409-4a59-b61b-33b657368f0f-kube-api-access-hnjht\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400063 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400201 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-combined-ca-bundle\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400229 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpsjg\" (UniqueName: \"kubernetes.io/projected/738fde20-8e94-46e9-bb59-24f917e279cd-kube-api-access-dpsjg\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400252 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-config-data\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400274 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-scripts\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400314 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400331 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400371 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-log-httpd\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.400452 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-scripts\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403261 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-config\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-run-httpd\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403351 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " 
pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403389 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403444 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/738fde20-8e94-46e9-bb59-24f917e279cd-logs\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-combined-ca-bundle\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403486 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403517 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-config-data\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 
05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.403545 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv86p\" (UniqueName: \"kubernetes.io/projected/a606323e-b83b-4046-aeff-ea4ded617943-kube-api-access-nv86p\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.412725 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.415803 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-combined-ca-bundle\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.416536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-run-httpd\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.416767 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-log-httpd\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.421027 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-config-data\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.423085 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-scripts\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.423214 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv86p\" (UniqueName: \"kubernetes.io/projected/a606323e-b83b-4046-aeff-ea4ded617943-kube-api-access-nv86p\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.424361 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-config\") pod \"neutron-db-sync-b8f64\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " pod="openstack/neutron-db-sync-b8f64" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.424768 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qltc\" (UniqueName: \"kubernetes.io/projected/95528d5c-f34b-4913-9db3-05ef436c106d-kube-api-access-8qltc\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.426257 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") " pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 
05:25:48.490073 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dzfl5" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505464 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-config-data\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505527 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-config\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505556 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnjht\" (UniqueName: \"kubernetes.io/projected/04f2acf0-5409-4a59-b61b-33b657368f0f-kube-api-access-hnjht\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505597 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpsjg\" (UniqueName: \"kubernetes.io/projected/738fde20-8e94-46e9-bb59-24f917e279cd-kube-api-access-dpsjg\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505635 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-scripts\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " 
pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505655 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505736 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.505775 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/738fde20-8e94-46e9-bb59-24f917e279cd-logs\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 
05:25:48.505796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-combined-ca-bundle\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.506476 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/738fde20-8e94-46e9-bb59-24f917e279cd-logs\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.506855 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.507025 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.507445 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-config\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.507563 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.507595 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.510181 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-combined-ca-bundle\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.514219 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-config-data\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.520166 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b8f64" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.521524 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-scripts\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.525618 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpsjg\" (UniqueName: \"kubernetes.io/projected/738fde20-8e94-46e9-bb59-24f917e279cd-kube-api-access-dpsjg\") pod \"placement-db-sync-8z7z8\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.527365 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnjht\" (UniqueName: \"kubernetes.io/projected/04f2acf0-5409-4a59-b61b-33b657368f0f-kube-api-access-hnjht\") pod \"dnsmasq-dns-6f6f8cb849-mtzkb\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") " pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.593879 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.651257 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8z7z8" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.678909 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.683802 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-v4csl"] Jan 30 05:25:48 crc kubenswrapper[4841]: W0130 05:25:48.703762 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97750314_b5f7_41a9_92cb_04567a919306.slice/crio-1655528a77fc9f8004876283cadc895aca3d77f94baa1644563c70f471e7b16e WatchSource:0}: Error finding container 1655528a77fc9f8004876283cadc895aca3d77f94baa1644563c70f471e7b16e: Status 404 returned error can't find the container with id 1655528a77fc9f8004876283cadc895aca3d77f94baa1644563c70f471e7b16e Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.774697 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7thb"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.791025 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.792356 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.798331 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.798599 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.798785 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tfq6h" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.798929 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.813165 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.813479 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.813502 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc 
kubenswrapper[4841]: I0130 05:25:48.813535 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.813590 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhvq8\" (UniqueName: \"kubernetes.io/projected/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-kube-api-access-nhvq8\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.813622 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-logs\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.813658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.813676 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 
crc kubenswrapper[4841]: I0130 05:25:48.828565 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.894726 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wwr2z"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-logs\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914776 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914809 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914840 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914855 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914885 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.914926 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhvq8\" (UniqueName: \"kubernetes.io/projected/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-kube-api-access-nhvq8\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.915560 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-logs\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.915843 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.916117 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.925261 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.927384 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.928507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.928699 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.933020 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.937944 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.940317 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.940534 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.945060 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.953316 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.981635 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhvq8\" (UniqueName: \"kubernetes.io/projected/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-kube-api-access-nhvq8\") pod \"glance-default-external-api-0\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:48 crc kubenswrapper[4841]: I0130 05:25:48.983663 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dzfl5"] Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.126118 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.127770 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.128546 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.128606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.128650 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hb6r\" (UniqueName: \"kubernetes.io/projected/3a63a7bb-35a1-48a8-90c1-d21117adc486-kube-api-access-2hb6r\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.128691 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.128744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.128777 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.128833 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.161961 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b8f64"] Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.226226 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231161 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231181 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231254 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231290 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hb6r\" (UniqueName: \"kubernetes.io/projected/3a63a7bb-35a1-48a8-90c1-d21117adc486-kube-api-access-2hb6r\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 
crc kubenswrapper[4841]: I0130 05:25:49.231332 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231379 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.231901 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.232104 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.233201 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.237020 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.245313 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.262311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.263109 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.271667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hb6r\" (UniqueName: \"kubernetes.io/projected/3a63a7bb-35a1-48a8-90c1-d21117adc486-kube-api-access-2hb6r\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.272049 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.283245 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.407946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzfl5" event={"ID":"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a","Type":"ContainerStarted","Data":"8c91b95ce0ea018cf6e877003cecb6ad1dc03b8adc36e924fc93a6bfb121a5fc"} Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.434839 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-mtzkb"] Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.445119 4841 generic.go:334] "Generic (PLEG): container finished" podID="97750314-b5f7-41a9-92cb-04567a919306" containerID="82f5c9452aadeeec0f045529015561ac3da7b2fd53f7d72792444d7c920a978e" exitCode=0 Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.445232 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl" event={"ID":"97750314-b5f7-41a9-92cb-04567a919306","Type":"ContainerDied","Data":"82f5c9452aadeeec0f045529015561ac3da7b2fd53f7d72792444d7c920a978e"} Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.445261 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl" event={"ID":"97750314-b5f7-41a9-92cb-04567a919306","Type":"ContainerStarted","Data":"1655528a77fc9f8004876283cadc895aca3d77f94baa1644563c70f471e7b16e"} Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.458772 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8z7z8"] Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.462562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerStarted","Data":"7a596e5efb750b1d547bfacb64364003ad337bfe0b01d1c0d08932eed1bdf44f"} Jan 30 05:25:49 crc kubenswrapper[4841]: W0130 05:25:49.472659 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04f2acf0_5409_4a59_b61b_33b657368f0f.slice/crio-6f4e1f5ffd28780646c236957b068739329db5aedab6b449f46bd76a138ad367 WatchSource:0}: Error finding container 6f4e1f5ffd28780646c236957b068739329db5aedab6b449f46bd76a138ad367: Status 404 returned error can't find the container with id 6f4e1f5ffd28780646c236957b068739329db5aedab6b449f46bd76a138ad367 Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.474540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7thb" event={"ID":"502d6fe3-4215-4b32-8546-a55e5a4afc91","Type":"ContainerStarted","Data":"bbea313c627f342f17d623bf1060c6258e35bc29916fdb9b73500546ef1265bc"} Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.486546 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwr2z" event={"ID":"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9","Type":"ContainerStarted","Data":"5ea394b127fd970af53caa39f1534450199ef44d3f1786176954b4e967668dcb"} Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.486591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwr2z" event={"ID":"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9","Type":"ContainerStarted","Data":"c12d0df66cf36b9da8421d18e963fe48216683c0e220ac29ef1fddcb8a71b12f"} Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.500804 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8f64" event={"ID":"a606323e-b83b-4046-aeff-ea4ded617943","Type":"ContainerStarted","Data":"e73b4956c1fe267eab078e4b8bc409093eaf79c3a536821a957a5a9cfc33ca8d"} Jan 30 05:25:49 crc 
kubenswrapper[4841]: I0130 05:25:49.517492 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wwr2z" podStartSLOduration=2.517468684 podStartE2EDuration="2.517468684s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:49.510595648 +0000 UTC m=+1086.504068286" watchObservedRunningTime="2026-01-30 05:25:49.517468684 +0000 UTC m=+1086.510941312" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.540982 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-b8f64" podStartSLOduration=2.540962929 podStartE2EDuration="2.540962929s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:49.540852366 +0000 UTC m=+1086.534325004" watchObservedRunningTime="2026-01-30 05:25:49.540962929 +0000 UTC m=+1086.534435557" Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.606510 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:49 crc kubenswrapper[4841]: I0130 05:25:49.965917 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.014723 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.060522 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-sb\") pod \"97750314-b5f7-41a9-92cb-04567a919306\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.060578 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-nb\") pod \"97750314-b5f7-41a9-92cb-04567a919306\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.060626 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-config\") pod \"97750314-b5f7-41a9-92cb-04567a919306\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.061937 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-svc\") pod \"97750314-b5f7-41a9-92cb-04567a919306\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.061995 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-swift-storage-0\") pod \"97750314-b5f7-41a9-92cb-04567a919306\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.066735 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mbcx\" 
(UniqueName: \"kubernetes.io/projected/97750314-b5f7-41a9-92cb-04567a919306-kube-api-access-7mbcx\") pod \"97750314-b5f7-41a9-92cb-04567a919306\" (UID: \"97750314-b5f7-41a9-92cb-04567a919306\") " Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.078257 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97750314-b5f7-41a9-92cb-04567a919306-kube-api-access-7mbcx" (OuterVolumeSpecName: "kube-api-access-7mbcx") pod "97750314-b5f7-41a9-92cb-04567a919306" (UID: "97750314-b5f7-41a9-92cb-04567a919306"). InnerVolumeSpecName "kube-api-access-7mbcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.092085 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97750314-b5f7-41a9-92cb-04567a919306" (UID: "97750314-b5f7-41a9-92cb-04567a919306"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.101663 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-config" (OuterVolumeSpecName: "config") pod "97750314-b5f7-41a9-92cb-04567a919306" (UID: "97750314-b5f7-41a9-92cb-04567a919306"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.118586 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "97750314-b5f7-41a9-92cb-04567a919306" (UID: "97750314-b5f7-41a9-92cb-04567a919306"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.143944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97750314-b5f7-41a9-92cb-04567a919306" (UID: "97750314-b5f7-41a9-92cb-04567a919306"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.158934 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97750314-b5f7-41a9-92cb-04567a919306" (UID: "97750314-b5f7-41a9-92cb-04567a919306"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.170846 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.170878 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.170887 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.170896 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:50 crc 
kubenswrapper[4841]: I0130 05:25:50.170907 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97750314-b5f7-41a9-92cb-04567a919306-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.170916 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mbcx\" (UniqueName: \"kubernetes.io/projected/97750314-b5f7-41a9-92cb-04567a919306-kube-api-access-7mbcx\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.529670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8z7z8" event={"ID":"738fde20-8e94-46e9-bb59-24f917e279cd","Type":"ContainerStarted","Data":"29eb23e8e6962874ebfa7461693d4ba17762ee16d090633394d30cf39df72e74"} Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.532217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl" event={"ID":"97750314-b5f7-41a9-92cb-04567a919306","Type":"ContainerDied","Data":"1655528a77fc9f8004876283cadc895aca3d77f94baa1644563c70f471e7b16e"} Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.532250 4841 scope.go:117] "RemoveContainer" containerID="82f5c9452aadeeec0f045529015561ac3da7b2fd53f7d72792444d7c920a978e" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.532354 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-v4csl" Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.537018 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a63a7bb-35a1-48a8-90c1-d21117adc486","Type":"ContainerStarted","Data":"5df74c607bc79253a08333d7293f8ac1b957d94f8524d260f86d0ec31e5fe155"} Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.540505 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4676d4e6-2ad0-4c2e-9383-ab1700c389f2","Type":"ContainerStarted","Data":"3463417a7b10bf01cf01f29297ea14ff4da3d157fcfeb9e7942473392639f44e"} Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.544616 4841 generic.go:334] "Generic (PLEG): container finished" podID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerID="2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9" exitCode=0 Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.544689 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" event={"ID":"04f2acf0-5409-4a59-b61b-33b657368f0f","Type":"ContainerDied","Data":"2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9"} Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.544712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" event={"ID":"04f2acf0-5409-4a59-b61b-33b657368f0f","Type":"ContainerStarted","Data":"6f4e1f5ffd28780646c236957b068739329db5aedab6b449f46bd76a138ad367"} Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.559210 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8f64" event={"ID":"a606323e-b83b-4046-aeff-ea4ded617943","Type":"ContainerStarted","Data":"4917c8f1f511e2ab1d1cbd400696ac027e9a598fabd5f2c438a5fb3e801553af"} Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.679522 4841 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-v4csl"] Jan 30 05:25:50 crc kubenswrapper[4841]: I0130 05:25:50.699854 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-v4csl"] Jan 30 05:25:51 crc kubenswrapper[4841]: I0130 05:25:51.300108 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:51 crc kubenswrapper[4841]: I0130 05:25:51.356469 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:25:51 crc kubenswrapper[4841]: I0130 05:25:51.369350 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:51 crc kubenswrapper[4841]: I0130 05:25:51.663844 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4676d4e6-2ad0-4c2e-9383-ab1700c389f2","Type":"ContainerStarted","Data":"0d56f0169dd79c3b5c44e6dfdb3e9e9c70f4258f48b01b103b52f667eea2d875"} Jan 30 05:25:51 crc kubenswrapper[4841]: I0130 05:25:51.667077 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" event={"ID":"04f2acf0-5409-4a59-b61b-33b657368f0f","Type":"ContainerStarted","Data":"eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37"} Jan 30 05:25:51 crc kubenswrapper[4841]: I0130 05:25:51.698280 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a63a7bb-35a1-48a8-90c1-d21117adc486","Type":"ContainerStarted","Data":"0984185d14d2c7b9d96eafc322b9f62d62c646fcbc3e77c6b28e1618f357a1f5"} Jan 30 05:25:51 crc kubenswrapper[4841]: I0130 05:25:51.704741 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" podStartSLOduration=3.704728272 podStartE2EDuration="3.704728272s" podCreationTimestamp="2026-01-30 05:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:51.704247879 +0000 UTC m=+1088.697720517" watchObservedRunningTime="2026-01-30 05:25:51.704728272 +0000 UTC m=+1088.698200910" Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.451461 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97750314-b5f7-41a9-92cb-04567a919306" path="/var/lib/kubelet/pods/97750314-b5f7-41a9-92cb-04567a919306/volumes" Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.709320 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4676d4e6-2ad0-4c2e-9383-ab1700c389f2","Type":"ContainerStarted","Data":"ccff00c4326e9c089c2e40b939ca105d011b12e978e557e81a8435798aaa66d6"} Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.709481 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-log" containerID="cri-o://0d56f0169dd79c3b5c44e6dfdb3e9e9c70f4258f48b01b103b52f667eea2d875" gracePeriod=30 Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.710001 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-httpd" containerID="cri-o://ccff00c4326e9c089c2e40b939ca105d011b12e978e557e81a8435798aaa66d6" gracePeriod=30 Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.713103 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerName="glance-log" containerID="cri-o://0984185d14d2c7b9d96eafc322b9f62d62c646fcbc3e77c6b28e1618f357a1f5" gracePeriod=30 Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.713157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"3a63a7bb-35a1-48a8-90c1-d21117adc486","Type":"ContainerStarted","Data":"27137f5699f9345d442ca5a1cd89a37b0897a5eae555b1f5b0b86bcb5d14a1e8"} Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.713174 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.713212 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerName="glance-httpd" containerID="cri-o://27137f5699f9345d442ca5a1cd89a37b0897a5eae555b1f5b0b86bcb5d14a1e8" gracePeriod=30 Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.738470 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.738445068 podStartE2EDuration="5.738445068s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:52.729090894 +0000 UTC m=+1089.722563532" watchObservedRunningTime="2026-01-30 05:25:52.738445068 +0000 UTC m=+1089.731917716" Jan 30 05:25:52 crc kubenswrapper[4841]: I0130 05:25:52.763314 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.763295239 podStartE2EDuration="5.763295239s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:25:52.752273102 +0000 UTC m=+1089.745745740" watchObservedRunningTime="2026-01-30 05:25:52.763295239 +0000 UTC m=+1089.756767877" Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.723240 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerID="ccff00c4326e9c089c2e40b939ca105d011b12e978e557e81a8435798aaa66d6" exitCode=0 Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.723523 4841 generic.go:334] "Generic (PLEG): container finished" podID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerID="0d56f0169dd79c3b5c44e6dfdb3e9e9c70f4258f48b01b103b52f667eea2d875" exitCode=143 Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.723337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4676d4e6-2ad0-4c2e-9383-ab1700c389f2","Type":"ContainerDied","Data":"ccff00c4326e9c089c2e40b939ca105d011b12e978e557e81a8435798aaa66d6"} Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.723594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4676d4e6-2ad0-4c2e-9383-ab1700c389f2","Type":"ContainerDied","Data":"0d56f0169dd79c3b5c44e6dfdb3e9e9c70f4258f48b01b103b52f667eea2d875"} Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.726757 4841 generic.go:334] "Generic (PLEG): container finished" podID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerID="27137f5699f9345d442ca5a1cd89a37b0897a5eae555b1f5b0b86bcb5d14a1e8" exitCode=0 Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.726782 4841 generic.go:334] "Generic (PLEG): container finished" podID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerID="0984185d14d2c7b9d96eafc322b9f62d62c646fcbc3e77c6b28e1618f357a1f5" exitCode=143 Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.726841 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a63a7bb-35a1-48a8-90c1-d21117adc486","Type":"ContainerDied","Data":"27137f5699f9345d442ca5a1cd89a37b0897a5eae555b1f5b0b86bcb5d14a1e8"} Jan 30 05:25:53 crc kubenswrapper[4841]: I0130 05:25:53.726893 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"3a63a7bb-35a1-48a8-90c1-d21117adc486","Type":"ContainerDied","Data":"0984185d14d2c7b9d96eafc322b9f62d62c646fcbc3e77c6b28e1618f357a1f5"} Jan 30 05:25:54 crc kubenswrapper[4841]: I0130 05:25:54.736201 4841 generic.go:334] "Generic (PLEG): container finished" podID="3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" containerID="5ea394b127fd970af53caa39f1534450199ef44d3f1786176954b4e967668dcb" exitCode=0 Jan 30 05:25:54 crc kubenswrapper[4841]: I0130 05:25:54.736239 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwr2z" event={"ID":"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9","Type":"ContainerDied","Data":"5ea394b127fd970af53caa39f1534450199ef44d3f1786176954b4e967668dcb"} Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.308892 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.313963 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.444743 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hb6r\" (UniqueName: \"kubernetes.io/projected/3a63a7bb-35a1-48a8-90c1-d21117adc486-kube-api-access-2hb6r\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.444808 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-config-data\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.444855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-logs\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.444902 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-scripts\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.444985 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-httpd-run\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445022 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-config-data\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445056 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhvq8\" (UniqueName: \"kubernetes.io/projected/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-kube-api-access-nhvq8\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445131 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-scripts\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445169 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-internal-tls-certs\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-httpd-run\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445294 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-public-tls-certs\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445355 
4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445497 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-logs\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445539 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-combined-ca-bundle\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445582 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"3a63a7bb-35a1-48a8-90c1-d21117adc486\" (UID: \"3a63a7bb-35a1-48a8-90c1-d21117adc486\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.445620 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-combined-ca-bundle\") pod \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\" (UID: \"4676d4e6-2ad0-4c2e-9383-ab1700c389f2\") " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.447224 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.447651 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.448557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-logs" (OuterVolumeSpecName: "logs") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.451079 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a63a7bb-35a1-48a8-90c1-d21117adc486-kube-api-access-2hb6r" (OuterVolumeSpecName: "kube-api-access-2hb6r") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "kube-api-access-2hb6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.451495 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-scripts" (OuterVolumeSpecName: "scripts") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.451703 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-logs" (OuterVolumeSpecName: "logs") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.454276 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-scripts" (OuterVolumeSpecName: "scripts") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.457391 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-kube-api-access-nhvq8" (OuterVolumeSpecName: "kube-api-access-nhvq8") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "kube-api-access-nhvq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.482624 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.482783 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.522109 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546766 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546801 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546812 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546825 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 
05:25:56.546834 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hb6r\" (UniqueName: \"kubernetes.io/projected/3a63a7bb-35a1-48a8-90c1-d21117adc486-kube-api-access-2hb6r\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546845 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546853 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546861 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a63a7bb-35a1-48a8-90c1-d21117adc486-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546868 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhvq8\" (UniqueName: \"kubernetes.io/projected/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-kube-api-access-nhvq8\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546878 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.546885 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.570463 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") 
on node "crc" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.585091 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.617146 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.620730 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-config-data" (OuterVolumeSpecName: "config-data") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.648074 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.648102 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.648112 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.648121 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.648249 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-config-data" (OuterVolumeSpecName: "config-data") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.650516 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4676d4e6-2ad0-4c2e-9383-ab1700c389f2" (UID: "4676d4e6-2ad0-4c2e-9383-ab1700c389f2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.653593 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3a63a7bb-35a1-48a8-90c1-d21117adc486" (UID: "3a63a7bb-35a1-48a8-90c1-d21117adc486"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.749930 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.749973 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a63a7bb-35a1-48a8-90c1-d21117adc486-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.749989 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4676d4e6-2ad0-4c2e-9383-ab1700c389f2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.765640 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4676d4e6-2ad0-4c2e-9383-ab1700c389f2","Type":"ContainerDied","Data":"3463417a7b10bf01cf01f29297ea14ff4da3d157fcfeb9e7942473392639f44e"} Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.765719 4841 scope.go:117] "RemoveContainer" containerID="ccff00c4326e9c089c2e40b939ca105d011b12e978e557e81a8435798aaa66d6" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.765929 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.773979 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a63a7bb-35a1-48a8-90c1-d21117adc486","Type":"ContainerDied","Data":"5df74c607bc79253a08333d7293f8ac1b957d94f8524d260f86d0ec31e5fe155"} Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.774025 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.807543 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.826905 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.852741 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:56 crc kubenswrapper[4841]: E0130 05:25:56.853510 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-log" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.853529 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-log" Jan 30 05:25:56 crc kubenswrapper[4841]: E0130 05:25:56.853569 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-httpd" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.853579 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-httpd" Jan 30 05:25:56 crc kubenswrapper[4841]: E0130 05:25:56.853601 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerName="glance-log" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.853609 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerName="glance-log" Jan 30 05:25:56 crc kubenswrapper[4841]: E0130 05:25:56.853625 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerName="glance-httpd" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.853633 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerName="glance-httpd" Jan 30 05:25:56 crc kubenswrapper[4841]: E0130 05:25:56.853666 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97750314-b5f7-41a9-92cb-04567a919306" containerName="init" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.853674 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="97750314-b5f7-41a9-92cb-04567a919306" containerName="init" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.854343 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-httpd" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.854380 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" containerName="glance-log" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.854410 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" containerName="glance-httpd" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.854429 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="97750314-b5f7-41a9-92cb-04567a919306" containerName="init" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.854447 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" 
containerName="glance-log" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.856143 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.863850 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.879930 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.880652 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.881023 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tfq6h" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.882954 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.929005 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.941778 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.951365 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.952843 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.955573 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.955578 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.970930 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.971167 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.971241 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.971314 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfqzd\" (UniqueName: \"kubernetes.io/projected/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-kube-api-access-zfqzd\") pod \"glance-default-external-api-0\" (UID: 
\"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.971411 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.976347 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.976530 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-logs\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.976715 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:56 crc kubenswrapper[4841]: I0130 05:25:56.975606 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079149 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079199 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079294 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079337 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfqzd\" (UniqueName: \"kubernetes.io/projected/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-kube-api-access-zfqzd\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079463 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079482 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079503 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079518 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2n5\" (UniqueName: \"kubernetes.io/projected/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-kube-api-access-bc2n5\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-logs\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.079561 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.080427 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.080540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-logs\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.080638 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.083885 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.093389 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.094170 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.094572 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.097447 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfqzd\" (UniqueName: \"kubernetes.io/projected/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-kube-api-access-zfqzd\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.117156 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181094 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181180 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181196 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2n5\" (UniqueName: \"kubernetes.io/projected/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-kube-api-access-bc2n5\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181269 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: 
I0130 05:25:57.181289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181334 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.181670 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.182061 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.182904 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.185792 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.186546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.187649 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.188289 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.197782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2n5\" (UniqueName: \"kubernetes.io/projected/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-kube-api-access-bc2n5\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.207705 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.218580 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:25:57 crc kubenswrapper[4841]: I0130 05:25:57.285627 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:25:58 crc kubenswrapper[4841]: I0130 05:25:58.444588 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a63a7bb-35a1-48a8-90c1-d21117adc486" path="/var/lib/kubelet/pods/3a63a7bb-35a1-48a8-90c1-d21117adc486/volumes" Jan 30 05:25:58 crc kubenswrapper[4841]: I0130 05:25:58.445884 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4676d4e6-2ad0-4c2e-9383-ab1700c389f2" path="/var/lib/kubelet/pods/4676d4e6-2ad0-4c2e-9383-ab1700c389f2/volumes" Jan 30 05:25:58 crc kubenswrapper[4841]: I0130 05:25:58.680684 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" Jan 30 05:25:58 crc kubenswrapper[4841]: I0130 05:25:58.766336 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-fb28v"] Jan 30 05:25:58 crc kubenswrapper[4841]: I0130 05:25:58.767168 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="dnsmasq-dns" containerID="cri-o://4505dd4fa912678acacc7f9626ff3079bef2687f832e614da822c9cdb2620da4" gracePeriod=10 Jan 30 05:25:59 crc kubenswrapper[4841]: I0130 05:25:59.811698 4841 generic.go:334] "Generic (PLEG): container finished" podID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" 
containerID="4505dd4fa912678acacc7f9626ff3079bef2687f832e614da822c9cdb2620da4" exitCode=0 Jan 30 05:25:59 crc kubenswrapper[4841]: I0130 05:25:59.811745 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" event={"ID":"797269f4-a97a-4a49-a5a0-f5a5623f5e0c","Type":"ContainerDied","Data":"4505dd4fa912678acacc7f9626ff3079bef2687f832e614da822c9cdb2620da4"} Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.094612 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wwr2z" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.190333 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-config-data\") pod \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.190431 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-scripts\") pod \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.190575 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5wm8\" (UniqueName: \"kubernetes.io/projected/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-kube-api-access-s5wm8\") pod \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.190633 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-combined-ca-bundle\") pod \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " 
Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.190721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-fernet-keys\") pod \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.190764 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-credential-keys\") pod \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\" (UID: \"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9\") " Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.197930 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" (UID: "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.199448 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-kube-api-access-s5wm8" (OuterVolumeSpecName: "kube-api-access-s5wm8") pod "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" (UID: "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9"). InnerVolumeSpecName "kube-api-access-s5wm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.200251 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-scripts" (OuterVolumeSpecName: "scripts") pod "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" (UID: "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.214653 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" (UID: "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.225646 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" (UID: "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.229860 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-config-data" (OuterVolumeSpecName: "config-data") pod "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" (UID: "3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.296576 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.296618 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.296631 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.296642 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.296657 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.296668 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5wm8\" (UniqueName: \"kubernetes.io/projected/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9-kube-api-access-s5wm8\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.835460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwr2z" event={"ID":"3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9","Type":"ContainerDied","Data":"c12d0df66cf36b9da8421d18e963fe48216683c0e220ac29ef1fddcb8a71b12f"} Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 
05:26:02.835761 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c12d0df66cf36b9da8421d18e963fe48216683c0e220ac29ef1fddcb8a71b12f" Jan 30 05:26:02 crc kubenswrapper[4841]: I0130 05:26:02.835510 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wwr2z" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.184746 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wwr2z"] Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.195022 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wwr2z"] Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.275555 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wwlqb"] Jan 30 05:26:03 crc kubenswrapper[4841]: E0130 05:26:03.275912 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" containerName="keystone-bootstrap" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.275924 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" containerName="keystone-bootstrap" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.276116 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" containerName="keystone-bootstrap" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.276747 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.279991 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.280034 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.280891 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.281057 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpg5t" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.281186 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.292301 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wwlqb"] Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.424022 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-fernet-keys\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.424097 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-credential-keys\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.424191 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-config-data\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.424223 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-scripts\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.424245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-combined-ca-bundle\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.424291 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztwd7\" (UniqueName: \"kubernetes.io/projected/4f3379a5-71bf-421e-8262-9f63141e2a09-kube-api-access-ztwd7\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.526819 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-fernet-keys\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.527039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-credential-keys\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.527152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-config-data\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.527251 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-scripts\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.527314 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-combined-ca-bundle\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.527362 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztwd7\" (UniqueName: \"kubernetes.io/projected/4f3379a5-71bf-421e-8262-9f63141e2a09-kube-api-access-ztwd7\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.534124 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-scripts\") pod \"keystone-bootstrap-wwlqb\" (UID: 
\"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.535146 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-config-data\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.537366 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-combined-ca-bundle\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.539659 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-fernet-keys\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.538025 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-credential-keys\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 05:26:03.558502 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztwd7\" (UniqueName: \"kubernetes.io/projected/4f3379a5-71bf-421e-8262-9f63141e2a09-kube-api-access-ztwd7\") pod \"keystone-bootstrap-wwlqb\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:03 crc kubenswrapper[4841]: I0130 
05:26:03.598123 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:04 crc kubenswrapper[4841]: I0130 05:26:04.448969 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9" path="/var/lib/kubelet/pods/3f4159ff-e846-4cfd-b4b6-7bd76ab5d6e9/volumes" Jan 30 05:26:07 crc kubenswrapper[4841]: I0130 05:26:07.889444 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Jan 30 05:26:10 crc kubenswrapper[4841]: I0130 05:26:10.463370 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:26:10 crc kubenswrapper[4841]: I0130 05:26:10.465719 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.080819 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.176699 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmvg4\" (UniqueName: \"kubernetes.io/projected/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-kube-api-access-nmvg4\") pod \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.176759 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-config\") pod \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.176846 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-svc\") pod \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.176887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-swift-storage-0\") pod \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.176917 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-nb\") pod \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.177126 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-sb\") pod \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\" (UID: \"797269f4-a97a-4a49-a5a0-f5a5623f5e0c\") " Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.189121 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-kube-api-access-nmvg4" (OuterVolumeSpecName: "kube-api-access-nmvg4") pod "797269f4-a97a-4a49-a5a0-f5a5623f5e0c" (UID: "797269f4-a97a-4a49-a5a0-f5a5623f5e0c"). InnerVolumeSpecName "kube-api-access-nmvg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.233319 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "797269f4-a97a-4a49-a5a0-f5a5623f5e0c" (UID: "797269f4-a97a-4a49-a5a0-f5a5623f5e0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.237532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "797269f4-a97a-4a49-a5a0-f5a5623f5e0c" (UID: "797269f4-a97a-4a49-a5a0-f5a5623f5e0c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.243586 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-config" (OuterVolumeSpecName: "config") pod "797269f4-a97a-4a49-a5a0-f5a5623f5e0c" (UID: "797269f4-a97a-4a49-a5a0-f5a5623f5e0c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.244302 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "797269f4-a97a-4a49-a5a0-f5a5623f5e0c" (UID: "797269f4-a97a-4a49-a5a0-f5a5623f5e0c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.246150 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "797269f4-a97a-4a49-a5a0-f5a5623f5e0c" (UID: "797269f4-a97a-4a49-a5a0-f5a5623f5e0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.278844 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.278881 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmvg4\" (UniqueName: \"kubernetes.io/projected/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-kube-api-access-nmvg4\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.278894 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.278902 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 
30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.278910 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.278918 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/797269f4-a97a-4a49-a5a0-f5a5623f5e0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:11 crc kubenswrapper[4841]: E0130 05:26:11.484122 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Jan 30 05:26:11 crc kubenswrapper[4841]: E0130 05:26:11.484284 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56dh59ch544h68bhc6hddh558h558h5d8h5b6h88hfdh547h59fh675h595h658h664hb8h55ch6fh55dh86h9bh5d6h6ch5fch658hbfh565hbbh646q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qltc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(95528d5c-f34b-4913-9db3-05ef436c106d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.490092 4841 scope.go:117] "RemoveContainer" containerID="0d56f0169dd79c3b5c44e6dfdb3e9e9c70f4258f48b01b103b52f667eea2d875" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.930888 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.930981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" event={"ID":"797269f4-a97a-4a49-a5a0-f5a5623f5e0c","Type":"ContainerDied","Data":"e75f7ea47764b2aad1d1e17337fd1d5fbc04c6cf0f011e342a9696358fbfe0c9"} Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.971200 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-fb28v"] Jan 30 05:26:11 crc kubenswrapper[4841]: I0130 05:26:11.979571 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-fb28v"] Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.442605 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" path="/var/lib/kubelet/pods/797269f4-a97a-4a49-a5a0-f5a5623f5e0c/volumes" Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.643847 4841 scope.go:117] "RemoveContainer" containerID="27137f5699f9345d442ca5a1cd89a37b0897a5eae555b1f5b0b86bcb5d14a1e8" Jan 30 05:26:12 crc kubenswrapper[4841]: E0130 05:26:12.645892 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Jan 30 05:26:12 crc kubenswrapper[4841]: E0130 05:26:12.646027 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pj5wd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-t7thb_openstack(502d6fe3-4215-4b32-8546-a55e5a4afc91): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 05:26:12 crc kubenswrapper[4841]: E0130 05:26:12.647831 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-t7thb" podUID="502d6fe3-4215-4b32-8546-a55e5a4afc91" Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.768538 4841 scope.go:117] "RemoveContainer" containerID="0984185d14d2c7b9d96eafc322b9f62d62c646fcbc3e77c6b28e1618f357a1f5" Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.825538 4841 scope.go:117] "RemoveContainer" containerID="4505dd4fa912678acacc7f9626ff3079bef2687f832e614da822c9cdb2620da4" Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.844008 4841 scope.go:117] "RemoveContainer" containerID="889730a22c7b6cea75cc2bd04ff5c116d6193d6c36fbcaba34622c33276ee496" Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.890290 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-fb28v" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.964088 4841 generic.go:334] "Generic (PLEG): container finished" podID="a606323e-b83b-4046-aeff-ea4ded617943" containerID="4917c8f1f511e2ab1d1cbd400696ac027e9a598fabd5f2c438a5fb3e801553af" exitCode=0 Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.964106 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-b8f64" event={"ID":"a606323e-b83b-4046-aeff-ea4ded617943","Type":"ContainerDied","Data":"4917c8f1f511e2ab1d1cbd400696ac027e9a598fabd5f2c438a5fb3e801553af"} Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.965846 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8z7z8" event={"ID":"738fde20-8e94-46e9-bb59-24f917e279cd","Type":"ContainerStarted","Data":"b3f1445caa2309dc539440752e7e25f18a4dc9dad577904f8bc72b78c43bd48f"} Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.969075 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzfl5" event={"ID":"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a","Type":"ContainerStarted","Data":"3d6a7ca7f9fa1770de4767c7a65014825dae033a2797b29ddd22537c9692650e"} Jan 30 05:26:12 crc kubenswrapper[4841]: E0130 05:26:12.974415 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-t7thb" podUID="502d6fe3-4215-4b32-8546-a55e5a4afc91" Jan 30 05:26:12 crc kubenswrapper[4841]: I0130 05:26:12.991644 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dzfl5" podStartSLOduration=2.391678109 podStartE2EDuration="25.991626886s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="2026-01-30 05:25:48.990756345 +0000 UTC m=+1085.984228983" lastFinishedPulling="2026-01-30 05:26:12.590705122 +0000 UTC m=+1109.584177760" observedRunningTime="2026-01-30 05:26:12.991445541 +0000 UTC m=+1109.984918179" watchObservedRunningTime="2026-01-30 05:26:12.991626886 +0000 UTC m=+1109.985099524" Jan 30 05:26:13 crc kubenswrapper[4841]: I0130 05:26:13.025656 4841 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/placement-db-sync-8z7z8" podStartSLOduration=1.878297761 podStartE2EDuration="25.025640166s" podCreationTimestamp="2026-01-30 05:25:48 +0000 UTC" firstStartedPulling="2026-01-30 05:25:49.462525468 +0000 UTC m=+1086.455998106" lastFinishedPulling="2026-01-30 05:26:12.609867873 +0000 UTC m=+1109.603340511" observedRunningTime="2026-01-30 05:26:13.004758789 +0000 UTC m=+1109.998231427" watchObservedRunningTime="2026-01-30 05:26:13.025640166 +0000 UTC m=+1110.019112804" Jan 30 05:26:13 crc kubenswrapper[4841]: E0130 05:26:13.064516 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda606323e_b83b_4046_aeff_ea4ded617943.slice/crio-conmon-4917c8f1f511e2ab1d1cbd400696ac027e9a598fabd5f2c438a5fb3e801553af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda606323e_b83b_4046_aeff_ea4ded617943.slice/crio-4917c8f1f511e2ab1d1cbd400696ac027e9a598fabd5f2c438a5fb3e801553af.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:26:13 crc kubenswrapper[4841]: I0130 05:26:13.146010 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wwlqb"] Jan 30 05:26:13 crc kubenswrapper[4841]: I0130 05:26:13.179376 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:26:13 crc kubenswrapper[4841]: I0130 05:26:13.304869 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:26:13 crc kubenswrapper[4841]: I0130 05:26:13.984951 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwlqb" event={"ID":"4f3379a5-71bf-421e-8262-9f63141e2a09","Type":"ContainerStarted","Data":"0fcfa87f68875bfddaa93b739bc572583fbc92347ff5282b6f4b67fd22982107"} Jan 30 05:26:13 crc 
kubenswrapper[4841]: I0130 05:26:13.985207 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwlqb" event={"ID":"4f3379a5-71bf-421e-8262-9f63141e2a09","Type":"ContainerStarted","Data":"65ea7a5330217033265419bfd4f89e3f20d0000738a0d7260b0771049c8438b0"} Jan 30 05:26:13 crc kubenswrapper[4841]: I0130 05:26:13.991666 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0","Type":"ContainerStarted","Data":"2d3900b1448fe376ef696f3456d87f5c517c207d97e6b3b893204ecbe37b4613"} Jan 30 05:26:13 crc kubenswrapper[4841]: I0130 05:26:13.995239 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556","Type":"ContainerStarted","Data":"f9520a966ad3395753a91a91f2d76e8ba7e77f0407a1d27fa908e2ab168dfbd2"} Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.005477 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerStarted","Data":"cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8"} Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.006818 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wwlqb" podStartSLOduration=11.006802737 podStartE2EDuration="11.006802737s" podCreationTimestamp="2026-01-30 05:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:14.003255694 +0000 UTC m=+1110.996728332" watchObservedRunningTime="2026-01-30 05:26:14.006802737 +0000 UTC m=+1111.000275385" Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.327733 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b8f64" Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.443713 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-combined-ca-bundle\") pod \"a606323e-b83b-4046-aeff-ea4ded617943\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.443830 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-config\") pod \"a606323e-b83b-4046-aeff-ea4ded617943\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.443952 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv86p\" (UniqueName: \"kubernetes.io/projected/a606323e-b83b-4046-aeff-ea4ded617943-kube-api-access-nv86p\") pod \"a606323e-b83b-4046-aeff-ea4ded617943\" (UID: \"a606323e-b83b-4046-aeff-ea4ded617943\") " Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.448438 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a606323e-b83b-4046-aeff-ea4ded617943-kube-api-access-nv86p" (OuterVolumeSpecName: "kube-api-access-nv86p") pod "a606323e-b83b-4046-aeff-ea4ded617943" (UID: "a606323e-b83b-4046-aeff-ea4ded617943"). InnerVolumeSpecName "kube-api-access-nv86p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.477725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a606323e-b83b-4046-aeff-ea4ded617943" (UID: "a606323e-b83b-4046-aeff-ea4ded617943"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.481147 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-config" (OuterVolumeSpecName: "config") pod "a606323e-b83b-4046-aeff-ea4ded617943" (UID: "a606323e-b83b-4046-aeff-ea4ded617943"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.545491 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.545624 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv86p\" (UniqueName: \"kubernetes.io/projected/a606323e-b83b-4046-aeff-ea4ded617943-kube-api-access-nv86p\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:14 crc kubenswrapper[4841]: I0130 05:26:14.545784 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a606323e-b83b-4046-aeff-ea4ded617943-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.017377 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8f64" event={"ID":"a606323e-b83b-4046-aeff-ea4ded617943","Type":"ContainerDied","Data":"e73b4956c1fe267eab078e4b8bc409093eaf79c3a536821a957a5a9cfc33ca8d"} Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.017636 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e73b4956c1fe267eab078e4b8bc409093eaf79c3a536821a957a5a9cfc33ca8d" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.017685 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b8f64" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.021713 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0","Type":"ContainerStarted","Data":"c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723"} Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.021750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0","Type":"ContainerStarted","Data":"e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52"} Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.035490 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556","Type":"ContainerStarted","Data":"78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7"} Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.035631 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556","Type":"ContainerStarted","Data":"01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3"} Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.056882 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.056862212 podStartE2EDuration="19.056862212s" podCreationTimestamp="2026-01-30 05:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:15.043554123 +0000 UTC m=+1112.037026761" watchObservedRunningTime="2026-01-30 05:26:15.056862212 +0000 UTC m=+1112.050334850" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.079311 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.079295059 podStartE2EDuration="19.079295059s" podCreationTimestamp="2026-01-30 05:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:15.06903752 +0000 UTC m=+1112.062510158" watchObservedRunningTime="2026-01-30 05:26:15.079295059 +0000 UTC m=+1112.072767697" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.235410 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685444497c-7d6rm"] Jan 30 05:26:15 crc kubenswrapper[4841]: E0130 05:26:15.235836 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="init" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.235861 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="init" Jan 30 05:26:15 crc kubenswrapper[4841]: E0130 05:26:15.235889 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a606323e-b83b-4046-aeff-ea4ded617943" containerName="neutron-db-sync" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.235896 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a606323e-b83b-4046-aeff-ea4ded617943" containerName="neutron-db-sync" Jan 30 05:26:15 crc kubenswrapper[4841]: E0130 05:26:15.235910 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="dnsmasq-dns" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.235918 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="dnsmasq-dns" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.236095 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a606323e-b83b-4046-aeff-ea4ded617943" 
containerName="neutron-db-sync" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.236112 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="797269f4-a97a-4a49-a5a0-f5a5623f5e0c" containerName="dnsmasq-dns" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.237037 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.266256 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685444497c-7d6rm"] Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.289761 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f5986d768-c66l5"] Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.291577 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.293327 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8vxjs" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.296670 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.300004 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.301582 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.304131 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f5986d768-c66l5"] Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.356510 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhppk\" (UniqueName: 
\"kubernetes.io/projected/92dee641-7744-48ab-b9dd-49555b7d9ce9-kube-api-access-lhppk\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.356755 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-combined-ca-bundle\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.356800 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.356821 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-ovndb-tls-certs\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.356862 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-svc\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.357032 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-httpd-config\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.357067 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-config\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.357111 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmbk\" (UniqueName: \"kubernetes.io/projected/3afd6c02-28f4-4dae-92ea-b3d04085383b-kube-api-access-6hmbk\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.357200 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.357236 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-config\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.357280 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459513 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459567 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-ovndb-tls-certs\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-svc\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459672 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-httpd-config\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459699 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-config\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459733 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmbk\" (UniqueName: \"kubernetes.io/projected/3afd6c02-28f4-4dae-92ea-b3d04085383b-kube-api-access-6hmbk\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459780 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-config\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459844 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459880 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhppk\" (UniqueName: 
\"kubernetes.io/projected/92dee641-7744-48ab-b9dd-49555b7d9ce9-kube-api-access-lhppk\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.459903 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-combined-ca-bundle\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.461895 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.462093 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.462339 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-svc\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.462697 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-nb\") pod 
\"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.462799 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-config\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.472119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-ovndb-tls-certs\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.472282 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-combined-ca-bundle\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.472324 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-httpd-config\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.475175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-config\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: 
I0130 05:26:15.478192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmbk\" (UniqueName: \"kubernetes.io/projected/3afd6c02-28f4-4dae-92ea-b3d04085383b-kube-api-access-6hmbk\") pod \"neutron-f5986d768-c66l5\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.519075 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhppk\" (UniqueName: \"kubernetes.io/projected/92dee641-7744-48ab-b9dd-49555b7d9ce9-kube-api-access-lhppk\") pod \"dnsmasq-dns-685444497c-7d6rm\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") " pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.604947 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:15 crc kubenswrapper[4841]: I0130 05:26:15.622825 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:16 crc kubenswrapper[4841]: I0130 05:26:16.043873 4841 generic.go:334] "Generic (PLEG): container finished" podID="738fde20-8e94-46e9-bb59-24f917e279cd" containerID="b3f1445caa2309dc539440752e7e25f18a4dc9dad577904f8bc72b78c43bd48f" exitCode=0 Jan 30 05:26:16 crc kubenswrapper[4841]: I0130 05:26:16.044869 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8z7z8" event={"ID":"738fde20-8e94-46e9-bb59-24f917e279cd","Type":"ContainerDied","Data":"b3f1445caa2309dc539440752e7e25f18a4dc9dad577904f8bc72b78c43bd48f"} Jan 30 05:26:16 crc kubenswrapper[4841]: I0130 05:26:16.149061 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685444497c-7d6rm"] Jan 30 05:26:16 crc kubenswrapper[4841]: I0130 05:26:16.260011 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f5986d768-c66l5"] Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.054705 4841 generic.go:334] "Generic (PLEG): container finished" podID="beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" containerID="3d6a7ca7f9fa1770de4767c7a65014825dae033a2797b29ddd22537c9692650e" exitCode=0 Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.054801 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzfl5" event={"ID":"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a","Type":"ContainerDied","Data":"3d6a7ca7f9fa1770de4767c7a65014825dae033a2797b29ddd22537c9692650e"} Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.057827 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5986d768-c66l5" event={"ID":"3afd6c02-28f4-4dae-92ea-b3d04085383b","Type":"ContainerStarted","Data":"667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345"} Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.057863 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5986d768-c66l5" 
event={"ID":"3afd6c02-28f4-4dae-92ea-b3d04085383b","Type":"ContainerStarted","Data":"36b6381960abe233954061fb0206a9bd82970ebe445f1def087e33f4c6cbb3d9"} Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.060080 4841 generic.go:334] "Generic (PLEG): container finished" podID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerID="c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a" exitCode=0 Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.060163 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-7d6rm" event={"ID":"92dee641-7744-48ab-b9dd-49555b7d9ce9","Type":"ContainerDied","Data":"c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a"} Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.060307 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-7d6rm" event={"ID":"92dee641-7744-48ab-b9dd-49555b7d9ce9","Type":"ContainerStarted","Data":"d62f4656252f33fa77e18b9b251fbc0ae82ff8f5c155fbf9ddba8639784e5b6e"} Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.219635 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.221629 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.275020 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.283961 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.286186 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: 
I0130 05:26:17.287929 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.360148 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.366112 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.703822 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bcfb9ffb5-q4t7z"] Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.705214 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.707442 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.708673 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.726199 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bcfb9ffb5-q4t7z"] Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.820324 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-httpd-config\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.820582 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-combined-ca-bundle\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.820733 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-config\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.820790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-internal-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.820973 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk56k\" (UniqueName: \"kubernetes.io/projected/fc4310f7-3d6f-45dc-a17e-5311da837e81-kube-api-access-xk56k\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.821029 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-ovndb-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.821192 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-public-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.922528 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-combined-ca-bundle\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.922609 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-config\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.922642 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-internal-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.922699 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk56k\" (UniqueName: \"kubernetes.io/projected/fc4310f7-3d6f-45dc-a17e-5311da837e81-kube-api-access-xk56k\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.922716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-ovndb-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.922737 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-public-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.922776 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-httpd-config\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.930439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-internal-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.930754 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-ovndb-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.930869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-config\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: 
\"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.931283 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-combined-ca-bundle\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.931957 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-public-tls-certs\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.939512 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk56k\" (UniqueName: \"kubernetes.io/projected/fc4310f7-3d6f-45dc-a17e-5311da837e81-kube-api-access-xk56k\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:17 crc kubenswrapper[4841]: I0130 05:26:17.944501 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-httpd-config\") pod \"neutron-6bcfb9ffb5-q4t7z\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:18 crc kubenswrapper[4841]: I0130 05:26:18.065867 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:18 crc kubenswrapper[4841]: I0130 05:26:18.074306 4841 generic.go:334] "Generic (PLEG): container finished" podID="4f3379a5-71bf-421e-8262-9f63141e2a09" containerID="0fcfa87f68875bfddaa93b739bc572583fbc92347ff5282b6f4b67fd22982107" exitCode=0 Jan 30 05:26:18 crc kubenswrapper[4841]: I0130 05:26:18.075118 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwlqb" event={"ID":"4f3379a5-71bf-421e-8262-9f63141e2a09","Type":"ContainerDied","Data":"0fcfa87f68875bfddaa93b739bc572583fbc92347ff5282b6f4b67fd22982107"} Jan 30 05:26:18 crc kubenswrapper[4841]: I0130 05:26:18.075152 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:18 crc kubenswrapper[4841]: I0130 05:26:18.075564 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:18 crc kubenswrapper[4841]: I0130 05:26:18.075591 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 05:26:18 crc kubenswrapper[4841]: I0130 05:26:18.075860 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 05:26:19 crc kubenswrapper[4841]: I0130 05:26:19.983652 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.865157 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dzfl5" Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.895472 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8z7z8" Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.914862 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994327 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-scripts\") pod \"4f3379a5-71bf-421e-8262-9f63141e2a09\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994662 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jj4q\" (UniqueName: \"kubernetes.io/projected/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-kube-api-access-2jj4q\") pod \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994725 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-db-sync-config-data\") pod \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994762 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-fernet-keys\") pod \"4f3379a5-71bf-421e-8262-9f63141e2a09\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994802 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpsjg\" (UniqueName: \"kubernetes.io/projected/738fde20-8e94-46e9-bb59-24f917e279cd-kube-api-access-dpsjg\") pod \"738fde20-8e94-46e9-bb59-24f917e279cd\" (UID: 
\"738fde20-8e94-46e9-bb59-24f917e279cd\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994829 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/738fde20-8e94-46e9-bb59-24f917e279cd-logs\") pod \"738fde20-8e94-46e9-bb59-24f917e279cd\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994843 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-config-data\") pod \"738fde20-8e94-46e9-bb59-24f917e279cd\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994895 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-credential-keys\") pod \"4f3379a5-71bf-421e-8262-9f63141e2a09\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994925 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-combined-ca-bundle\") pod \"4f3379a5-71bf-421e-8262-9f63141e2a09\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.994971 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-config-data\") pod \"4f3379a5-71bf-421e-8262-9f63141e2a09\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.995001 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-combined-ca-bundle\") pod \"738fde20-8e94-46e9-bb59-24f917e279cd\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.995029 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-combined-ca-bundle\") pod \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\" (UID: \"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.995057 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztwd7\" (UniqueName: \"kubernetes.io/projected/4f3379a5-71bf-421e-8262-9f63141e2a09-kube-api-access-ztwd7\") pod \"4f3379a5-71bf-421e-8262-9f63141e2a09\" (UID: \"4f3379a5-71bf-421e-8262-9f63141e2a09\") " Jan 30 05:26:20 crc kubenswrapper[4841]: I0130 05:26:20.995088 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-scripts\") pod \"738fde20-8e94-46e9-bb59-24f917e279cd\" (UID: \"738fde20-8e94-46e9-bb59-24f917e279cd\") " Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:20.999893 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738fde20-8e94-46e9-bb59-24f917e279cd-logs" (OuterVolumeSpecName: "logs") pod "738fde20-8e94-46e9-bb59-24f917e279cd" (UID: "738fde20-8e94-46e9-bb59-24f917e279cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.009299 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4f3379a5-71bf-421e-8262-9f63141e2a09" (UID: "4f3379a5-71bf-421e-8262-9f63141e2a09"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.009758 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738fde20-8e94-46e9-bb59-24f917e279cd-kube-api-access-dpsjg" (OuterVolumeSpecName: "kube-api-access-dpsjg") pod "738fde20-8e94-46e9-bb59-24f917e279cd" (UID: "738fde20-8e94-46e9-bb59-24f917e279cd"). InnerVolumeSpecName "kube-api-access-dpsjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.009939 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3379a5-71bf-421e-8262-9f63141e2a09-kube-api-access-ztwd7" (OuterVolumeSpecName: "kube-api-access-ztwd7") pod "4f3379a5-71bf-421e-8262-9f63141e2a09" (UID: "4f3379a5-71bf-421e-8262-9f63141e2a09"). InnerVolumeSpecName "kube-api-access-ztwd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.015220 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-scripts" (OuterVolumeSpecName: "scripts") pod "738fde20-8e94-46e9-bb59-24f917e279cd" (UID: "738fde20-8e94-46e9-bb59-24f917e279cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.015709 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-scripts" (OuterVolumeSpecName: "scripts") pod "4f3379a5-71bf-421e-8262-9f63141e2a09" (UID: "4f3379a5-71bf-421e-8262-9f63141e2a09"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.015735 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" (UID: "beaf0495-d1d3-42df-b727-dc0c6fb5fe2a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.016054 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-kube-api-access-2jj4q" (OuterVolumeSpecName: "kube-api-access-2jj4q") pod "beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" (UID: "beaf0495-d1d3-42df-b727-dc0c6fb5fe2a"). InnerVolumeSpecName "kube-api-access-2jj4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.029317 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4f3379a5-71bf-421e-8262-9f63141e2a09" (UID: "4f3379a5-71bf-421e-8262-9f63141e2a09"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.040376 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" (UID: "beaf0495-d1d3-42df-b727-dc0c6fb5fe2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.043282 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.069266 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-config-data" (OuterVolumeSpecName: "config-data") pod "4f3379a5-71bf-421e-8262-9f63141e2a09" (UID: "4f3379a5-71bf-421e-8262-9f63141e2a09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.069380 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f3379a5-71bf-421e-8262-9f63141e2a09" (UID: "4f3379a5-71bf-421e-8262-9f63141e2a09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.079290 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.079944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-config-data" (OuterVolumeSpecName: "config-data") pod "738fde20-8e94-46e9-bb59-24f917e279cd" (UID: "738fde20-8e94-46e9-bb59-24f917e279cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.086515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "738fde20-8e94-46e9-bb59-24f917e279cd" (UID: "738fde20-8e94-46e9-bb59-24f917e279cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096905 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096927 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096935 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096944 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096956 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztwd7\" (UniqueName: \"kubernetes.io/projected/4f3379a5-71bf-421e-8262-9f63141e2a09-kube-api-access-ztwd7\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096966 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096973 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096981 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jj4q\" (UniqueName: \"kubernetes.io/projected/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-kube-api-access-2jj4q\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096991 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.096998 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.097006 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpsjg\" (UniqueName: \"kubernetes.io/projected/738fde20-8e94-46e9-bb59-24f917e279cd-kube-api-access-dpsjg\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.097014 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/738fde20-8e94-46e9-bb59-24f917e279cd-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.097021 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738fde20-8e94-46e9-bb59-24f917e279cd-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc 
kubenswrapper[4841]: I0130 05:26:21.097029 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f3379a5-71bf-421e-8262-9f63141e2a09-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.111278 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-7d6rm" event={"ID":"92dee641-7744-48ab-b9dd-49555b7d9ce9","Type":"ContainerStarted","Data":"b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da"} Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.112796 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-685444497c-7d6rm" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.129566 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerStarted","Data":"fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6"} Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.133674 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wwlqb" event={"ID":"4f3379a5-71bf-421e-8262-9f63141e2a09","Type":"ContainerDied","Data":"65ea7a5330217033265419bfd4f89e3f20d0000738a0d7260b0771049c8438b0"} Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.133697 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65ea7a5330217033265419bfd4f89e3f20d0000738a0d7260b0771049c8438b0" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.133753 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wwlqb" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.137422 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-685444497c-7d6rm" podStartSLOduration=6.137374513 podStartE2EDuration="6.137374513s" podCreationTimestamp="2026-01-30 05:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:21.129839846 +0000 UTC m=+1118.123312484" watchObservedRunningTime="2026-01-30 05:26:21.137374513 +0000 UTC m=+1118.130847151" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.139199 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8z7z8" event={"ID":"738fde20-8e94-46e9-bb59-24f917e279cd","Type":"ContainerDied","Data":"29eb23e8e6962874ebfa7461693d4ba17762ee16d090633394d30cf39df72e74"} Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.139227 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29eb23e8e6962874ebfa7461693d4ba17762ee16d090633394d30cf39df72e74" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.139273 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8z7z8" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.142202 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzfl5" event={"ID":"beaf0495-d1d3-42df-b727-dc0c6fb5fe2a","Type":"ContainerDied","Data":"8c91b95ce0ea018cf6e877003cecb6ad1dc03b8adc36e924fc93a6bfb121a5fc"} Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.142241 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c91b95ce0ea018cf6e877003cecb6ad1dc03b8adc36e924fc93a6bfb121a5fc" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.142301 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dzfl5" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.158366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5986d768-c66l5" event={"ID":"3afd6c02-28f4-4dae-92ea-b3d04085383b","Type":"ContainerStarted","Data":"db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f"} Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.158827 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.184840 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.215572 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f5986d768-c66l5" podStartSLOduration=6.215546399 podStartE2EDuration="6.215546399s" podCreationTimestamp="2026-01-30 05:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:21.185788621 +0000 UTC m=+1118.179261269" watchObservedRunningTime="2026-01-30 05:26:21.215546399 +0000 UTC m=+1118.209019037" Jan 30 05:26:21 crc kubenswrapper[4841]: I0130 05:26:21.321198 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bcfb9ffb5-q4t7z"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.053265 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5445c58497-n245m"] Jan 30 05:26:22 crc kubenswrapper[4841]: E0130 05:26:22.053946 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3379a5-71bf-421e-8262-9f63141e2a09" containerName="keystone-bootstrap" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.053958 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3379a5-71bf-421e-8262-9f63141e2a09" 
containerName="keystone-bootstrap" Jan 30 05:26:22 crc kubenswrapper[4841]: E0130 05:26:22.053979 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738fde20-8e94-46e9-bb59-24f917e279cd" containerName="placement-db-sync" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.053985 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="738fde20-8e94-46e9-bb59-24f917e279cd" containerName="placement-db-sync" Jan 30 05:26:22 crc kubenswrapper[4841]: E0130 05:26:22.053997 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" containerName="barbican-db-sync" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.054005 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" containerName="barbican-db-sync" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.054172 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="738fde20-8e94-46e9-bb59-24f917e279cd" containerName="placement-db-sync" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.054180 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3379a5-71bf-421e-8262-9f63141e2a09" containerName="keystone-bootstrap" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.054188 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" containerName="barbican-db-sync" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.054696 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.056535 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.056790 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.057461 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpg5t" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.057505 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.058281 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.058636 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.076834 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5445c58497-n245m"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.091177 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7cb787b658-zbrbz"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.092588 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.099914 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.100152 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4vwkx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.100285 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.099923 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.118824 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.121205 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-config-data\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.121890 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq567\" (UniqueName: \"kubernetes.io/projected/b191848f-8f16-40e6-8a2b-f66a0179f359-kube-api-access-nq567\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.121952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-credential-keys\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.122072 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-internal-tls-certs\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.122243 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-public-tls-certs\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.122268 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-combined-ca-bundle\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.122321 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-fernet-keys\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.122524 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-scripts\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.133600 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cb787b658-zbrbz"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.159451 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d4467fc7c-fcdx6"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.160871 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.166676 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.166854 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t24qx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.166974 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.194458 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.195952 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.207632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bcfb9ffb5-q4t7z" event={"ID":"fc4310f7-3d6f-45dc-a17e-5311da837e81","Type":"ContainerStarted","Data":"6b1c61cf113a9d02dc8e84edb3cccdb64ff41343455a288f83fa9a65400016f9"} Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.207676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bcfb9ffb5-q4t7z" event={"ID":"fc4310f7-3d6f-45dc-a17e-5311da837e81","Type":"ContainerStarted","Data":"2b1a0e1c3a5b1bc5a2cea93c3da88301a3042badaf0a1dbc97c75b42d3595c7d"} Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.207692 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bcfb9ffb5-q4t7z" event={"ID":"fc4310f7-3d6f-45dc-a17e-5311da837e81","Type":"ContainerStarted","Data":"df0465c42a22f9325cdc36a94eb6914e40060da2ba8c0b4deb2b8e7e9c081ea8"} Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.211676 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.220503 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d4467fc7c-fcdx6"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-combined-ca-bundle\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225215 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-combined-ca-bundle\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225234 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-public-tls-certs\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-fernet-keys\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225269 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-internal-tls-certs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225317 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-scripts\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b660389-643d-46fa-aa90-747dc6dbe5f9-logs\") 
pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225365 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data-custom\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225381 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8476\" (UniqueName: \"kubernetes.io/projected/cae633ad-6e09-4487-a2ad-21fc13696859-kube-api-access-l8476\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225428 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-config-data\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225447 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae633ad-6e09-4487-a2ad-21fc13696859-logs\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-combined-ca-bundle\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225512 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-public-tls-certs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225529 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-config-data\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225552 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4xt\" (UniqueName: \"kubernetes.io/projected/8b660389-643d-46fa-aa90-747dc6dbe5f9-kube-api-access-9c4xt\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225574 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nq567\" (UniqueName: \"kubernetes.io/projected/b191848f-8f16-40e6-8a2b-f66a0179f359-kube-api-access-nq567\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-credential-keys\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225617 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-internal-tls-certs\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.225637 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-scripts\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.234811 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-internal-tls-certs\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.240644 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-public-tls-certs\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.241296 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-config-data\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.242678 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-scripts\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.249914 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-fernet-keys\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.250071 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.253066 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-credential-keys\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.253131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nq567\" (UniqueName: \"kubernetes.io/projected/b191848f-8f16-40e6-8a2b-f66a0179f359-kube-api-access-nq567\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.253487 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-combined-ca-bundle\") pod \"keystone-5445c58497-n245m\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.286970 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bcfb9ffb5-q4t7z" podStartSLOduration=5.286954233 podStartE2EDuration="5.286954233s" podCreationTimestamp="2026-01-30 05:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:22.274742043 +0000 UTC m=+1119.268214681" watchObservedRunningTime="2026-01-30 05:26:22.286954233 +0000 UTC m=+1119.280426871" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.334673 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685444497c-7d6rm"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-scripts\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-combined-ca-bundle\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335650 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-combined-ca-bundle\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335669 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-internal-tls-certs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335691 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c5b922-a9ee-44a7-a350-966abc4e4809-logs\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335798 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b660389-643d-46fa-aa90-747dc6dbe5f9-logs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335821 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data-custom\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335837 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8476\" (UniqueName: \"kubernetes.io/projected/cae633ad-6e09-4487-a2ad-21fc13696859-kube-api-access-l8476\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335913 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae633ad-6e09-4487-a2ad-21fc13696859-logs\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335935 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-combined-ca-bundle\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.335978 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.336014 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-public-tls-certs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.336032 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-config-data\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.336062 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data-custom\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.336104 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4xt\" (UniqueName: \"kubernetes.io/projected/8b660389-643d-46fa-aa90-747dc6dbe5f9-kube-api-access-9c4xt\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.336704 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k546x\" (UniqueName: \"kubernetes.io/projected/69c5b922-a9ee-44a7-a350-966abc4e4809-kube-api-access-k546x\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.350217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae633ad-6e09-4487-a2ad-21fc13696859-logs\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.353154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b660389-643d-46fa-aa90-747dc6dbe5f9-logs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.353326 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-combined-ca-bundle\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.359942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.372715 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.380017 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data-custom\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.383266 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-internal-tls-certs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.383288 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-config-data\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.383565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-combined-ca-bundle\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.383685 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-public-tls-certs\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc 
kubenswrapper[4841]: I0130 05:26:22.383782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-scripts\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.393564 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.395552 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.400129 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4xt\" (UniqueName: \"kubernetes.io/projected/8b660389-643d-46fa-aa90-747dc6dbe5f9-kube-api-access-9c4xt\") pod \"placement-7cb787b658-zbrbz\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.408882 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8476\" (UniqueName: \"kubernetes.io/projected/cae633ad-6e09-4487-a2ad-21fc13696859-kube-api-access-l8476\") pod \"barbican-worker-7d4467fc7c-fcdx6\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") " pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.438133 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458724 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " 
pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-config\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458811 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-nb\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458845 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data-custom\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k546x\" (UniqueName: \"kubernetes.io/projected/69c5b922-a9ee-44a7-a350-966abc4e4809-kube-api-access-k546x\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458930 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-swift-storage-0\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458947 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-combined-ca-bundle\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.458988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c5b922-a9ee-44a7-a350-966abc4e4809-logs\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.459012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-sb\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.459060 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-svc\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.459089 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-zf8wt\" (UniqueName: \"kubernetes.io/projected/569f263e-9a57-4cff-bb00-da7a3a5923c8-kube-api-access-zf8wt\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.462861 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.476878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c5b922-a9ee-44a7-a350-966abc4e4809-logs\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.487161 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.503201 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k546x\" (UniqueName: \"kubernetes.io/projected/69c5b922-a9ee-44a7-a350-966abc4e4809-kube-api-access-k546x\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.515972 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data-custom\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.516584 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-combined-ca-bundle\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.520376 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data\") pod \"barbican-keystone-listener-65d7cd9bbd-ltngt\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") " pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.533774 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.534561 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c979bc7b8-j85nw"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.610595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-swift-storage-0\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.610664 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-sb\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.610702 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-svc\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.610749 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf8wt\" (UniqueName: \"kubernetes.io/projected/569f263e-9a57-4cff-bb00-da7a3a5923c8-kube-api-access-zf8wt\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.610811 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-config\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.610835 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-nb\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.611717 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-nb\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.612495 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-swift-storage-0\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.621102 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c979bc7b8-j85nw"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.621258 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.624160 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-svc\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.624990 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-config\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.625234 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.646202 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-sb\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.709729 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zf8wt\" (UniqueName: \"kubernetes.io/projected/569f263e-9a57-4cff-bb00-da7a3a5923c8-kube-api-access-zf8wt\") pod \"dnsmasq-dns-66cdd4b5b5-n9dmx\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.717284 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.717355 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data-custom\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.717384 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvpv5\" (UniqueName: \"kubernetes.io/projected/da063a32-f0f1-4849-b6da-d592237caf41-kube-api-access-gvpv5\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.717456 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-combined-ca-bundle\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.717480 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da063a32-f0f1-4849-b6da-d592237caf41-logs\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.794499 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69fd44f6fc-dpzfd"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.817470 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.822753 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.822836 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data-custom\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.822863 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvpv5\" (UniqueName: \"kubernetes.io/projected/da063a32-f0f1-4849-b6da-d592237caf41-kube-api-access-gvpv5\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.822912 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-combined-ca-bundle\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.822948 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da063a32-f0f1-4849-b6da-d592237caf41-logs\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.826600 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7b9cf755cd-5p4pk"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.827914 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da063a32-f0f1-4849-b6da-d592237caf41-logs\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.832602 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.849582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-combined-ca-bundle\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.852292 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.853632 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvpv5\" (UniqueName: \"kubernetes.io/projected/da063a32-f0f1-4849-b6da-d592237caf41-kube-api-access-gvpv5\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.853995 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69fd44f6fc-dpzfd"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.862561 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data-custom\") pod \"barbican-api-5c979bc7b8-j85nw\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.875566 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b9cf755cd-5p4pk"] Jan 30 05:26:22 crc 
kubenswrapper[4841]: I0130 05:26:22.878548 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.924844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-logs\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925156 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data-custom\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925198 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data-custom\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925236 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-combined-ca-bundle\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925268 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xczf\" (UniqueName: \"kubernetes.io/projected/a035e8ef-e433-4c59-a0fd-09937eb5f226-kube-api-access-6xczf\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925287 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4tm6\" (UniqueName: \"kubernetes.io/projected/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-kube-api-access-n4tm6\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.925355 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a035e8ef-e433-4c59-a0fd-09937eb5f226-logs\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " 
pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.931623 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-combined-ca-bundle\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.940449 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6fc8b6ddd6-nkc6r"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.949633 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.953584 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fc8b6ddd6-nkc6r"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.953780 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.984865 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77d66cd676-sj9d2"] Jan 30 05:26:22 crc kubenswrapper[4841]: I0130 05:26:22.989286 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77d66cd676-sj9d2" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.012282 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77d66cd676-sj9d2"] Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036416 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data-custom\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036442 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-combined-ca-bundle\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036466 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-combined-ca-bundle\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036497 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-scripts\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036520 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xczf\" (UniqueName: \"kubernetes.io/projected/a035e8ef-e433-4c59-a0fd-09937eb5f226-kube-api-access-6xczf\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036542 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036865 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-internal-tls-certs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036915 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4tm6\" (UniqueName: \"kubernetes.io/projected/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-kube-api-access-n4tm6\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036947 
4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a035e8ef-e433-4c59-a0fd-09937eb5f226-logs\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9p22\" (UniqueName: \"kubernetes.io/projected/28551500-d017-475a-aae4-8352782c0b4e-kube-api-access-j9p22\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.036989 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-combined-ca-bundle\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.037029 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28551500-d017-475a-aae4-8352782c0b4e-logs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.037050 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-logs\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 
05:26:23.037074 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-config-data\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.037098 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-public-tls-certs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.037115 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data-custom\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.042019 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a035e8ef-e433-4c59-a0fd-09937eb5f226-logs\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.042932 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data-custom\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.043144 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.043385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-logs\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.045737 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-combined-ca-bundle\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.048733 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data-custom\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.048914 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-combined-ca-bundle\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 
05:26:23.057185 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.073084 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xczf\" (UniqueName: \"kubernetes.io/projected/a035e8ef-e433-4c59-a0fd-09937eb5f226-kube-api-access-6xczf\") pod \"barbican-worker-69fd44f6fc-dpzfd\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.075944 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4tm6\" (UniqueName: \"kubernetes.io/projected/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-kube-api-access-n4tm6\") pod \"barbican-keystone-listener-7b9cf755cd-5p4pk\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.138892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-combined-ca-bundle\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.138956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9p22\" (UniqueName: \"kubernetes.io/projected/28551500-d017-475a-aae4-8352782c0b4e-kube-api-access-j9p22\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc 
kubenswrapper[4841]: I0130 05:26:23.139006 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5p8\" (UniqueName: \"kubernetes.io/projected/b0608cab-e4d7-4288-9ca2-831df372d653-kube-api-access-jj5p8\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28551500-d017-475a-aae4-8352782c0b4e-logs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-config-data\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-public-tls-certs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139135 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0608cab-e4d7-4288-9ca2-831df372d653-logs\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2" Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139183 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-combined-ca-bundle\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139204 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-scripts\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139251 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-internal-tls-certs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139268 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.139289 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data-custom\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.140099 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28551500-d017-475a-aae4-8352782c0b4e-logs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.146030 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-internal-tls-certs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.146269 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-scripts\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.148622 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-combined-ca-bundle\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.148985 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-public-tls-certs\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.149491 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-config-data\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.156055 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9p22\" (UniqueName: \"kubernetes.io/projected/28551500-d017-475a-aae4-8352782c0b4e-kube-api-access-j9p22\") pod \"placement-6fc8b6ddd6-nkc6r\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.222578 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bcfb9ffb5-q4t7z"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.242313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0608cab-e4d7-4288-9ca2-831df372d653-logs\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.242426 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.242447 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data-custom\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.242488 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-combined-ca-bundle\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.242524 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5p8\" (UniqueName: \"kubernetes.io/projected/b0608cab-e4d7-4288-9ca2-831df372d653-kube-api-access-jj5p8\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.243127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0608cab-e4d7-4288-9ca2-831df372d653-logs\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.251908 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data-custom\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.251964 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-combined-ca-bundle\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.252043 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69fd44f6fc-dpzfd"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.253301 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.261190 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.262115 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5p8\" (UniqueName: \"kubernetes.io/projected/b0608cab-e4d7-4288-9ca2-831df372d653-kube-api-access-jj5p8\") pod \"barbican-api-77d66cd676-sj9d2\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.285481 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.336705 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.399093 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5445c58497-n245m"]
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.756165 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d4467fc7c-fcdx6"]
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.767484 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cb787b658-zbrbz"]
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.788066 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"]
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.804204 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"]
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.819179 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c979bc7b8-j85nw"]
Jan 30 05:26:23 crc kubenswrapper[4841]: W0130 05:26:23.832421 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda063a32_f0f1_4849_b6da_d592237caf41.slice/crio-149fb8974aa4f982c06b0be34cc976c4e1eecaba0f11667667583e4c465ffa9f WatchSource:0}: Error finding container 149fb8974aa4f982c06b0be34cc976c4e1eecaba0f11667667583e4c465ffa9f: Status 404 returned error can't find the container with id 149fb8974aa4f982c06b0be34cc976c4e1eecaba0f11667667583e4c465ffa9f
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.873169 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b9cf755cd-5p4pk"]
Jan 30 05:26:23 crc kubenswrapper[4841]: I0130 05:26:23.888601 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69fd44f6fc-dpzfd"]
Jan 30 05:26:23 crc kubenswrapper[4841]: W0130 05:26:23.896091 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda035e8ef_e433_4c59_a0fd_09937eb5f226.slice/crio-d7b374183505e48eddd61c21c0009dd11a4291e4f2b2d78965b774068964e6c4 WatchSource:0}: Error finding container d7b374183505e48eddd61c21c0009dd11a4291e4f2b2d78965b774068964e6c4: Status 404 returned error can't find the container with id d7b374183505e48eddd61c21c0009dd11a4291e4f2b2d78965b774068964e6c4
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.035450 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fc8b6ddd6-nkc6r"]
Jan 30 05:26:24 crc kubenswrapper[4841]: W0130 05:26:24.045962 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28551500_d017_475a_aae4_8352782c0b4e.slice/crio-fb3571d156a923c7d7af537c762725872e8f52a1c84ffa314dad1c545f735afb WatchSource:0}: Error finding container fb3571d156a923c7d7af537c762725872e8f52a1c84ffa314dad1c545f735afb: Status 404 returned error can't find the container with id fb3571d156a923c7d7af537c762725872e8f52a1c84ffa314dad1c545f735afb
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.113341 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77d66cd676-sj9d2"]
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.236157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" event={"ID":"3b29e384-bd86-4102-8a9e-4745cd0ae8d5","Type":"ContainerStarted","Data":"a98cfd75ca6b38eb01720689777a32dfeb6ddb4b423204a48d38c8e079b35cd6"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.237282 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" event={"ID":"cae633ad-6e09-4487-a2ad-21fc13696859","Type":"ContainerStarted","Data":"aa698e62fc79938d10af4d5712e326275eca4bcf08580b96fe9563e13104b152"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.239646 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c979bc7b8-j85nw" event={"ID":"da063a32-f0f1-4849-b6da-d592237caf41","Type":"ContainerStarted","Data":"8e91f0687ac0db5068ef7e0d7151bd9537afd5e5cfd1c59d5e214bf6abf733e8"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.239670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c979bc7b8-j85nw" event={"ID":"da063a32-f0f1-4849-b6da-d592237caf41","Type":"ContainerStarted","Data":"149fb8974aa4f982c06b0be34cc976c4e1eecaba0f11667667583e4c465ffa9f"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.241078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc8b6ddd6-nkc6r" event={"ID":"28551500-d017-475a-aae4-8352782c0b4e","Type":"ContainerStarted","Data":"fb3571d156a923c7d7af537c762725872e8f52a1c84ffa314dad1c545f735afb"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.245417 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" event={"ID":"a035e8ef-e433-4c59-a0fd-09937eb5f226","Type":"ContainerStarted","Data":"d7b374183505e48eddd61c21c0009dd11a4291e4f2b2d78965b774068964e6c4"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.247284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb787b658-zbrbz" event={"ID":"8b660389-643d-46fa-aa90-747dc6dbe5f9","Type":"ContainerStarted","Data":"71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.247308 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb787b658-zbrbz" event={"ID":"8b660389-643d-46fa-aa90-747dc6dbe5f9","Type":"ContainerStarted","Data":"6e921f6df5c6d836874ba876b9fd7e3235b60728d4235eb932dee3d6d7b157f5"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.249393 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5445c58497-n245m" event={"ID":"b191848f-8f16-40e6-8a2b-f66a0179f359","Type":"ContainerStarted","Data":"5f9a9849cbf3d28b7e6adfb866f7318a167f6a59eaf9156c8cc67f2d836ff57f"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.249430 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5445c58497-n245m" event={"ID":"b191848f-8f16-40e6-8a2b-f66a0179f359","Type":"ContainerStarted","Data":"c76dedfa2ca22effdff62e518b1c1253e08c42c734b788c4024dfa5241c692cc"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.250454 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5445c58497-n245m"
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.252597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d66cd676-sj9d2" event={"ID":"b0608cab-e4d7-4288-9ca2-831df372d653","Type":"ContainerStarted","Data":"068f0557b6f2402d00a6e002663214e793ed1601b0687c1852a732b8e5efb417"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.260625 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" event={"ID":"69c5b922-a9ee-44a7-a350-966abc4e4809","Type":"ContainerStarted","Data":"5c88135e730c48cceb98d83c0995f1027d0023e394d7622377b5dc0cce9cbdc2"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.262303 4841 generic.go:334] "Generic (PLEG): container finished" podID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerID="874020d90aa08682927abe4c4d2d7fd0c134e81df493ef2d1dd120899bbaf959" exitCode=0
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.262426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" event={"ID":"569f263e-9a57-4cff-bb00-da7a3a5923c8","Type":"ContainerDied","Data":"874020d90aa08682927abe4c4d2d7fd0c134e81df493ef2d1dd120899bbaf959"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.262468 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" event={"ID":"569f263e-9a57-4cff-bb00-da7a3a5923c8","Type":"ContainerStarted","Data":"6e023825af3d1da526affd9f95a0db4a2494646b21789b16565cd36340b200d6"}
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.262889 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-685444497c-7d6rm" podUID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerName="dnsmasq-dns" containerID="cri-o://b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da" gracePeriod=10
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.281232 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5445c58497-n245m" podStartSLOduration=2.281216931 podStartE2EDuration="2.281216931s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:24.271189719 +0000 UTC m=+1121.264662357" watchObservedRunningTime="2026-01-30 05:26:24.281216931 +0000 UTC m=+1121.274689569"
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.828584 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-7d6rm"
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.902560 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-svc\") pod \"92dee641-7744-48ab-b9dd-49555b7d9ce9\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") "
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.902721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhppk\" (UniqueName: \"kubernetes.io/projected/92dee641-7744-48ab-b9dd-49555b7d9ce9-kube-api-access-lhppk\") pod \"92dee641-7744-48ab-b9dd-49555b7d9ce9\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") "
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.902739 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-sb\") pod \"92dee641-7744-48ab-b9dd-49555b7d9ce9\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") "
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.902774 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-nb\") pod \"92dee641-7744-48ab-b9dd-49555b7d9ce9\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") "
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.902803 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-config\") pod \"92dee641-7744-48ab-b9dd-49555b7d9ce9\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") "
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.902874 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-swift-storage-0\") pod \"92dee641-7744-48ab-b9dd-49555b7d9ce9\" (UID: \"92dee641-7744-48ab-b9dd-49555b7d9ce9\") "
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.941648 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92dee641-7744-48ab-b9dd-49555b7d9ce9-kube-api-access-lhppk" (OuterVolumeSpecName: "kube-api-access-lhppk") pod "92dee641-7744-48ab-b9dd-49555b7d9ce9" (UID: "92dee641-7744-48ab-b9dd-49555b7d9ce9"). InnerVolumeSpecName "kube-api-access-lhppk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.991097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "92dee641-7744-48ab-b9dd-49555b7d9ce9" (UID: "92dee641-7744-48ab-b9dd-49555b7d9ce9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:24 crc kubenswrapper[4841]: I0130 05:26:24.995806 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92dee641-7744-48ab-b9dd-49555b7d9ce9" (UID: "92dee641-7744-48ab-b9dd-49555b7d9ce9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.005545 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhppk\" (UniqueName: \"kubernetes.io/projected/92dee641-7744-48ab-b9dd-49555b7d9ce9-kube-api-access-lhppk\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.005583 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.005592 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.007877 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92dee641-7744-48ab-b9dd-49555b7d9ce9" (UID: "92dee641-7744-48ab-b9dd-49555b7d9ce9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.064130 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92dee641-7744-48ab-b9dd-49555b7d9ce9" (UID: "92dee641-7744-48ab-b9dd-49555b7d9ce9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.071615 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-config" (OuterVolumeSpecName: "config") pod "92dee641-7744-48ab-b9dd-49555b7d9ce9" (UID: "92dee641-7744-48ab-b9dd-49555b7d9ce9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.108301 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.108365 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.108374 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92dee641-7744-48ab-b9dd-49555b7d9ce9-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.274448 4841 generic.go:334] "Generic (PLEG): container finished" podID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerID="b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da" exitCode=0
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.274501 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-7d6rm"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.274542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-7d6rm" event={"ID":"92dee641-7744-48ab-b9dd-49555b7d9ce9","Type":"ContainerDied","Data":"b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.274613 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-7d6rm" event={"ID":"92dee641-7744-48ab-b9dd-49555b7d9ce9","Type":"ContainerDied","Data":"d62f4656252f33fa77e18b9b251fbc0ae82ff8f5c155fbf9ddba8639784e5b6e"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.274631 4841 scope.go:117] "RemoveContainer" containerID="b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.297352 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb787b658-zbrbz" event={"ID":"8b660389-643d-46fa-aa90-747dc6dbe5f9","Type":"ContainerStarted","Data":"e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.297738 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cb787b658-zbrbz"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.297792 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cb787b658-zbrbz"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.301348 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d66cd676-sj9d2" event={"ID":"b0608cab-e4d7-4288-9ca2-831df372d653","Type":"ContainerStarted","Data":"8074b38b93b4b4437dfa123aa02f0cce562d3c49629666abfcfca788b967095c"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.301369 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d66cd676-sj9d2" event={"ID":"b0608cab-e4d7-4288-9ca2-831df372d653","Type":"ContainerStarted","Data":"8030d99396b89edc5d7d06dcf3d56bab00d6e79110e9c57a635ab6b9a5d1877e"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.302159 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.302179 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.315250 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685444497c-7d6rm"]
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.316435 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" event={"ID":"569f263e-9a57-4cff-bb00-da7a3a5923c8","Type":"ContainerStarted","Data":"aabf880281ea8e9bfd967d4b1672d7e1b1a4b6bda8d727751bb2143a2b045b4e"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.317095 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.319448 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c979bc7b8-j85nw" event={"ID":"da063a32-f0f1-4849-b6da-d592237caf41","Type":"ContainerStarted","Data":"5cd1529dfaed65efce838b2cf1dcbb9d36186b52d55e2fce1867ed9c8f4b657c"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.319839 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c979bc7b8-j85nw"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.319865 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c979bc7b8-j85nw"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.322721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc8b6ddd6-nkc6r" event={"ID":"28551500-d017-475a-aae4-8352782c0b4e","Type":"ContainerStarted","Data":"1608c08a181e26696b13ff2abda250d1ad88e50ec15ce51f053cef31c22f983e"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.322742 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc8b6ddd6-nkc6r" event={"ID":"28551500-d017-475a-aae4-8352782c0b4e","Type":"ContainerStarted","Data":"088a47d541f00148328e19a9fb5636697d24c805841cd93d6a4ac4b7d6e6779f"}
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.322755 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.322781 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fc8b6ddd6-nkc6r"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.326040 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685444497c-7d6rm"]
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.348090 4841 scope.go:117] "RemoveContainer" containerID="c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.354190 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7cb787b658-zbrbz" podStartSLOduration=3.354176554 podStartE2EDuration="3.354176554s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:25.328951564 +0000 UTC m=+1122.322424212" watchObservedRunningTime="2026-01-30 05:26:25.354176554 +0000 UTC m=+1122.347649192"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.393193 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77d66cd676-sj9d2" podStartSLOduration=3.393171385 podStartE2EDuration="3.393171385s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:25.359539965 +0000 UTC m=+1122.353012603" watchObservedRunningTime="2026-01-30 05:26:25.393171385 +0000 UTC m=+1122.386644023"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.396850 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6fc8b6ddd6-nkc6r" podStartSLOduration=3.396842161 podStartE2EDuration="3.396842161s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:25.386663114 +0000 UTC m=+1122.380135752" watchObservedRunningTime="2026-01-30 05:26:25.396842161 +0000 UTC m=+1122.390314799"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.399655 4841 scope.go:117] "RemoveContainer" containerID="b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da"
Jan 30 05:26:25 crc kubenswrapper[4841]: E0130 05:26:25.402028 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da\": container with ID starting with b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da not found: ID does not exist" containerID="b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.402080 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da"} err="failed to get container status \"b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da\": rpc error: code = NotFound desc = could not find container \"b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da\": container with ID starting with b05e78c4b96060da0d2fb345dc2f4679720f12d6ad7571d65394bcf6412896da not found: ID does not exist"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.402107 4841 scope.go:117] "RemoveContainer" containerID="c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a"
Jan 30 05:26:25 crc kubenswrapper[4841]: E0130 05:26:25.407962 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a\": container with ID starting with c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a not found: ID does not exist" containerID="c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.408001 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a"} err="failed to get container status \"c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a\": rpc error: code = NotFound desc = could not find container \"c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a\": container with ID starting with c08133f4129a7126b1a6d4a619856a1cd3bffd9a546a8f7ee43b6d42ac78121a not found: ID does not exist"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.424925 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" podStartSLOduration=3.424900976 podStartE2EDuration="3.424900976s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:25.407673694 +0000 UTC m=+1122.401146332" watchObservedRunningTime="2026-01-30 05:26:25.424900976 +0000 UTC m=+1122.418373614"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.449820 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c979bc7b8-j85nw" podStartSLOduration=3.449799867 podStartE2EDuration="3.449799867s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:25.427573356 +0000 UTC m=+1122.421045994" watchObservedRunningTime="2026-01-30 05:26:25.449799867 +0000 UTC m=+1122.443272505"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.461539 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c979bc7b8-j85nw"]
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.480606 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d94d6f7cb-lf9nq"]
Jan 30 05:26:25 crc kubenswrapper[4841]: E0130 05:26:25.480983 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerName="dnsmasq-dns"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.480999 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerName="dnsmasq-dns"
Jan 30 05:26:25 crc kubenswrapper[4841]: E0130 05:26:25.481016 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerName="init"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.481023 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerName="init"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.481224 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="92dee641-7744-48ab-b9dd-49555b7d9ce9" containerName="dnsmasq-dns"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.482157 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.485888 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.486994 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.500443 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d94d6f7cb-lf9nq"]
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.520515 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqhrz\" (UniqueName: \"kubernetes.io/projected/91db9edf-7d6d-4189-aaac-480a438900be-kube-api-access-vqhrz\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.520563 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-combined-ca-bundle\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.520675 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data-custom\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.520719 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91db9edf-7d6d-4189-aaac-480a438900be-logs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.520744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-public-tls-certs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.520769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.520836 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-internal-tls-certs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.622646 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqhrz\" (UniqueName: \"kubernetes.io/projected/91db9edf-7d6d-4189-aaac-480a438900be-kube-api-access-vqhrz\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.622705 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-combined-ca-bundle\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.622754 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data-custom\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.622782 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91db9edf-7d6d-4189-aaac-480a438900be-logs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.622812 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-public-tls-certs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.622836 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.622873 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-internal-tls-certs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.624484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91db9edf-7d6d-4189-aaac-480a438900be-logs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.627494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data-custom\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.627976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-internal-tls-certs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.628017 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-public-tls-certs\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.628766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-combined-ca-bundle\") pod 
\"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.632507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.639286 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqhrz\" (UniqueName: \"kubernetes.io/projected/91db9edf-7d6d-4189-aaac-480a438900be-kube-api-access-vqhrz\") pod \"barbican-api-d94d6f7cb-lf9nq\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:25 crc kubenswrapper[4841]: I0130 05:26:25.804991 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:26 crc kubenswrapper[4841]: I0130 05:26:26.443295 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92dee641-7744-48ab-b9dd-49555b7d9ce9" path="/var/lib/kubelet/pods/92dee641-7744-48ab-b9dd-49555b7d9ce9/volumes" Jan 30 05:26:26 crc kubenswrapper[4841]: I0130 05:26:26.751232 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d94d6f7cb-lf9nq"] Jan 30 05:26:26 crc kubenswrapper[4841]: W0130 05:26:26.756537 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91db9edf_7d6d_4189_aaac_480a438900be.slice/crio-b4d32361f9052fe7c32aa9c1a7bae2e5fb1d7bfde165d57d450468b7d3637822 WatchSource:0}: Error finding container b4d32361f9052fe7c32aa9c1a7bae2e5fb1d7bfde165d57d450468b7d3637822: Status 404 returned error can't find the container with id b4d32361f9052fe7c32aa9c1a7bae2e5fb1d7bfde165d57d450468b7d3637822 Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.357133 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d94d6f7cb-lf9nq" event={"ID":"91db9edf-7d6d-4189-aaac-480a438900be","Type":"ContainerStarted","Data":"bc897d8ef0f32c66c2606560ae71bcd74a208effcd7e7b2a87ba2f0b34843405"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.357389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d94d6f7cb-lf9nq" event={"ID":"91db9edf-7d6d-4189-aaac-480a438900be","Type":"ContainerStarted","Data":"0de4c7d3f6fcb35d2a2e2a038bee89e95869ad06f6b00924683feb595867a396"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.357425 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d94d6f7cb-lf9nq" event={"ID":"91db9edf-7d6d-4189-aaac-480a438900be","Type":"ContainerStarted","Data":"b4d32361f9052fe7c32aa9c1a7bae2e5fb1d7bfde165d57d450468b7d3637822"} Jan 30 05:26:27 crc 
kubenswrapper[4841]: I0130 05:26:27.357773 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.357815 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.359816 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" event={"ID":"69c5b922-a9ee-44a7-a350-966abc4e4809","Type":"ContainerStarted","Data":"e19b87a704474a39dde8c492160b1a7d52c9fc0ec7d9920d7bb3bc793b82d4c6"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.359846 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" event={"ID":"69c5b922-a9ee-44a7-a350-966abc4e4809","Type":"ContainerStarted","Data":"626098a8afa8e25d0fa56ed15e16341bd2dc5528d29cc4461cbcc4e3b8845bca"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.361691 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" event={"ID":"cae633ad-6e09-4487-a2ad-21fc13696859","Type":"ContainerStarted","Data":"617639990fdd7a9fc550c4d31f88262c90b61c1d859843dedcd6704d84e2fe6b"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.361797 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" event={"ID":"cae633ad-6e09-4487-a2ad-21fc13696859","Type":"ContainerStarted","Data":"ef99740979d9c7920bf8f0fabf36f408f8c7cef1d02940b3ff12f1aa80e86c55"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.363779 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" event={"ID":"a035e8ef-e433-4c59-a0fd-09937eb5f226","Type":"ContainerStarted","Data":"ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 
05:26:27.363834 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" event={"ID":"a035e8ef-e433-4c59-a0fd-09937eb5f226","Type":"ContainerStarted","Data":"da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.368385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" event={"ID":"3b29e384-bd86-4102-8a9e-4745cd0ae8d5","Type":"ContainerStarted","Data":"5c4314bd5d4e8c9a32d1faa3556ab7cd3dae6013e3228dbb71483e6295221f1f"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.368433 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" event={"ID":"3b29e384-bd86-4102-8a9e-4745cd0ae8d5","Type":"ContainerStarted","Data":"0a7c1a689c4e10c27ddb7c3c12fcc8d9ee61127c2c5b8d9fdc13d03072e0a7c1"} Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.368970 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c979bc7b8-j85nw" podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api-log" containerID="cri-o://8e91f0687ac0db5068ef7e0d7151bd9537afd5e5cfd1c59d5e214bf6abf733e8" gracePeriod=30 Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.369524 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c979bc7b8-j85nw" podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api" containerID="cri-o://5cd1529dfaed65efce838b2cf1dcbb9d36186b52d55e2fce1867ed9c8f4b657c" gracePeriod=30 Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.383313 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d94d6f7cb-lf9nq" podStartSLOduration=2.383292804 podStartE2EDuration="2.383292804s" podCreationTimestamp="2026-01-30 05:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:27.373256152 +0000 UTC m=+1124.366728790" watchObservedRunningTime="2026-01-30 05:26:27.383292804 +0000 UTC m=+1124.376765442" Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.402859 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" podStartSLOduration=3.05447113 podStartE2EDuration="5.402842626s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="2026-01-30 05:26:23.94422753 +0000 UTC m=+1120.937700168" lastFinishedPulling="2026-01-30 05:26:26.292599026 +0000 UTC m=+1123.286071664" observedRunningTime="2026-01-30 05:26:27.400990608 +0000 UTC m=+1124.394463246" watchObservedRunningTime="2026-01-30 05:26:27.402842626 +0000 UTC m=+1124.396315264" Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.430461 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7d4467fc7c-fcdx6"] Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.439958 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" podStartSLOduration=2.949932534 podStartE2EDuration="5.439937917s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="2026-01-30 05:26:23.802670165 +0000 UTC m=+1120.796142803" lastFinishedPulling="2026-01-30 05:26:26.292675538 +0000 UTC m=+1123.286148186" observedRunningTime="2026-01-30 05:26:27.42208369 +0000 UTC m=+1124.415556328" watchObservedRunningTime="2026-01-30 05:26:27.439937917 +0000 UTC m=+1124.433410555" Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.516338 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" podStartSLOduration=3.168652428 podStartE2EDuration="5.516320016s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" 
firstStartedPulling="2026-01-30 05:26:23.94422665 +0000 UTC m=+1120.937699288" lastFinishedPulling="2026-01-30 05:26:26.291894238 +0000 UTC m=+1123.285366876" observedRunningTime="2026-01-30 05:26:27.442838423 +0000 UTC m=+1124.436311061" watchObservedRunningTime="2026-01-30 05:26:27.516320016 +0000 UTC m=+1124.509792654" Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.543872 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"] Jan 30 05:26:27 crc kubenswrapper[4841]: I0130 05:26:27.563102 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" podStartSLOduration=3.05485978 podStartE2EDuration="5.56308189s" podCreationTimestamp="2026-01-30 05:26:22 +0000 UTC" firstStartedPulling="2026-01-30 05:26:23.782882637 +0000 UTC m=+1120.776355275" lastFinishedPulling="2026-01-30 05:26:26.291104747 +0000 UTC m=+1123.284577385" observedRunningTime="2026-01-30 05:26:27.473005562 +0000 UTC m=+1124.466478200" watchObservedRunningTime="2026-01-30 05:26:27.56308189 +0000 UTC m=+1124.556554528" Jan 30 05:26:28 crc kubenswrapper[4841]: I0130 05:26:28.382669 4841 generic.go:334] "Generic (PLEG): container finished" podID="da063a32-f0f1-4849-b6da-d592237caf41" containerID="5cd1529dfaed65efce838b2cf1dcbb9d36186b52d55e2fce1867ed9c8f4b657c" exitCode=0 Jan 30 05:26:28 crc kubenswrapper[4841]: I0130 05:26:28.382971 4841 generic.go:334] "Generic (PLEG): container finished" podID="da063a32-f0f1-4849-b6da-d592237caf41" containerID="8e91f0687ac0db5068ef7e0d7151bd9537afd5e5cfd1c59d5e214bf6abf733e8" exitCode=143 Jan 30 05:26:28 crc kubenswrapper[4841]: I0130 05:26:28.383421 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c979bc7b8-j85nw" event={"ID":"da063a32-f0f1-4849-b6da-d592237caf41","Type":"ContainerDied","Data":"5cd1529dfaed65efce838b2cf1dcbb9d36186b52d55e2fce1867ed9c8f4b657c"} Jan 30 05:26:28 crc kubenswrapper[4841]: 
I0130 05:26:28.383467 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c979bc7b8-j85nw" event={"ID":"da063a32-f0f1-4849-b6da-d592237caf41","Type":"ContainerDied","Data":"8e91f0687ac0db5068ef7e0d7151bd9537afd5e5cfd1c59d5e214bf6abf733e8"} Jan 30 05:26:29 crc kubenswrapper[4841]: I0130 05:26:29.394328 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker-log" containerID="cri-o://ef99740979d9c7920bf8f0fabf36f408f8c7cef1d02940b3ff12f1aa80e86c55" gracePeriod=30 Jan 30 05:26:29 crc kubenswrapper[4841]: I0130 05:26:29.394459 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker" containerID="cri-o://617639990fdd7a9fc550c4d31f88262c90b61c1d859843dedcd6704d84e2fe6b" gracePeriod=30 Jan 30 05:26:29 crc kubenswrapper[4841]: I0130 05:26:29.394741 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener-log" containerID="cri-o://626098a8afa8e25d0fa56ed15e16341bd2dc5528d29cc4461cbcc4e3b8845bca" gracePeriod=30 Jan 30 05:26:29 crc kubenswrapper[4841]: I0130 05:26:29.394836 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener" containerID="cri-o://e19b87a704474a39dde8c492160b1a7d52c9fc0ec7d9920d7bb3bc793b82d4c6" gracePeriod=30 Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.410091 4841 generic.go:334] "Generic (PLEG): container finished" podID="cae633ad-6e09-4487-a2ad-21fc13696859" 
containerID="617639990fdd7a9fc550c4d31f88262c90b61c1d859843dedcd6704d84e2fe6b" exitCode=0 Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.410429 4841 generic.go:334] "Generic (PLEG): container finished" podID="cae633ad-6e09-4487-a2ad-21fc13696859" containerID="ef99740979d9c7920bf8f0fabf36f408f8c7cef1d02940b3ff12f1aa80e86c55" exitCode=143 Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.410499 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" event={"ID":"cae633ad-6e09-4487-a2ad-21fc13696859","Type":"ContainerDied","Data":"617639990fdd7a9fc550c4d31f88262c90b61c1d859843dedcd6704d84e2fe6b"} Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.410537 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" event={"ID":"cae633ad-6e09-4487-a2ad-21fc13696859","Type":"ContainerDied","Data":"ef99740979d9c7920bf8f0fabf36f408f8c7cef1d02940b3ff12f1aa80e86c55"} Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.412697 4841 generic.go:334] "Generic (PLEG): container finished" podID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerID="e19b87a704474a39dde8c492160b1a7d52c9fc0ec7d9920d7bb3bc793b82d4c6" exitCode=0 Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.412732 4841 generic.go:334] "Generic (PLEG): container finished" podID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerID="626098a8afa8e25d0fa56ed15e16341bd2dc5528d29cc4461cbcc4e3b8845bca" exitCode=143 Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.412758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" event={"ID":"69c5b922-a9ee-44a7-a350-966abc4e4809","Type":"ContainerDied","Data":"e19b87a704474a39dde8c492160b1a7d52c9fc0ec7d9920d7bb3bc793b82d4c6"} Jan 30 05:26:30 crc kubenswrapper[4841]: I0130 05:26:30.412786 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" 
event={"ID":"69c5b922-a9ee-44a7-a350-966abc4e4809","Type":"ContainerDied","Data":"626098a8afa8e25d0fa56ed15e16341bd2dc5528d29cc4461cbcc4e3b8845bca"} Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.146025 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c979bc7b8-j85nw" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.265099 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data-custom\") pod \"da063a32-f0f1-4849-b6da-d592237caf41\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.265421 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-combined-ca-bundle\") pod \"da063a32-f0f1-4849-b6da-d592237caf41\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.265487 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvpv5\" (UniqueName: \"kubernetes.io/projected/da063a32-f0f1-4849-b6da-d592237caf41-kube-api-access-gvpv5\") pod \"da063a32-f0f1-4849-b6da-d592237caf41\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.265555 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data\") pod \"da063a32-f0f1-4849-b6da-d592237caf41\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.265682 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/da063a32-f0f1-4849-b6da-d592237caf41-logs\") pod \"da063a32-f0f1-4849-b6da-d592237caf41\" (UID: \"da063a32-f0f1-4849-b6da-d592237caf41\") " Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.266545 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da063a32-f0f1-4849-b6da-d592237caf41-logs" (OuterVolumeSpecName: "logs") pod "da063a32-f0f1-4849-b6da-d592237caf41" (UID: "da063a32-f0f1-4849-b6da-d592237caf41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.270795 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da063a32-f0f1-4849-b6da-d592237caf41" (UID: "da063a32-f0f1-4849-b6da-d592237caf41"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.270867 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da063a32-f0f1-4849-b6da-d592237caf41-kube-api-access-gvpv5" (OuterVolumeSpecName: "kube-api-access-gvpv5") pod "da063a32-f0f1-4849-b6da-d592237caf41" (UID: "da063a32-f0f1-4849-b6da-d592237caf41"). InnerVolumeSpecName "kube-api-access-gvpv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.295741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da063a32-f0f1-4849-b6da-d592237caf41" (UID: "da063a32-f0f1-4849-b6da-d592237caf41"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.316390 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data" (OuterVolumeSpecName: "config-data") pod "da063a32-f0f1-4849-b6da-d592237caf41" (UID: "da063a32-f0f1-4849-b6da-d592237caf41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.368224 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da063a32-f0f1-4849-b6da-d592237caf41-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.368255 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.368266 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.368276 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvpv5\" (UniqueName: \"kubernetes.io/projected/da063a32-f0f1-4849-b6da-d592237caf41-kube-api-access-gvpv5\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.368286 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da063a32-f0f1-4849-b6da-d592237caf41-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.440026 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c979bc7b8-j85nw"
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.447252 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c979bc7b8-j85nw" event={"ID":"da063a32-f0f1-4849-b6da-d592237caf41","Type":"ContainerDied","Data":"149fb8974aa4f982c06b0be34cc976c4e1eecaba0f11667667583e4c465ffa9f"}
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.447298 4841 scope.go:117] "RemoveContainer" containerID="5cd1529dfaed65efce838b2cf1dcbb9d36186b52d55e2fce1867ed9c8f4b657c"
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.474549 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c979bc7b8-j85nw"]
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.481275 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c979bc7b8-j85nw"]
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.633318 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d4467fc7c-fcdx6"
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.639162 4841 scope.go:117] "RemoveContainer" containerID="8e91f0687ac0db5068ef7e0d7151bd9537afd5e5cfd1c59d5e214bf6abf733e8"
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.643450 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.675790 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-combined-ca-bundle\") pod \"cae633ad-6e09-4487-a2ad-21fc13696859\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.675858 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data-custom\") pod \"69c5b922-a9ee-44a7-a350-966abc4e4809\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.675953 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data\") pod \"cae633ad-6e09-4487-a2ad-21fc13696859\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676041 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae633ad-6e09-4487-a2ad-21fc13696859-logs\") pod \"cae633ad-6e09-4487-a2ad-21fc13696859\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c5b922-a9ee-44a7-a350-966abc4e4809-logs\") pod \"69c5b922-a9ee-44a7-a350-966abc4e4809\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676211 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data\") pod \"69c5b922-a9ee-44a7-a350-966abc4e4809\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676295 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8476\" (UniqueName: \"kubernetes.io/projected/cae633ad-6e09-4487-a2ad-21fc13696859-kube-api-access-l8476\") pod \"cae633ad-6e09-4487-a2ad-21fc13696859\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676363 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-combined-ca-bundle\") pod \"69c5b922-a9ee-44a7-a350-966abc4e4809\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676476 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k546x\" (UniqueName: \"kubernetes.io/projected/69c5b922-a9ee-44a7-a350-966abc4e4809-kube-api-access-k546x\") pod \"69c5b922-a9ee-44a7-a350-966abc4e4809\" (UID: \"69c5b922-a9ee-44a7-a350-966abc4e4809\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676560 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data-custom\") pod \"cae633ad-6e09-4487-a2ad-21fc13696859\" (UID: \"cae633ad-6e09-4487-a2ad-21fc13696859\") "
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.676697 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c5b922-a9ee-44a7-a350-966abc4e4809-logs" (OuterVolumeSpecName: "logs") pod "69c5b922-a9ee-44a7-a350-966abc4e4809" (UID: "69c5b922-a9ee-44a7-a350-966abc4e4809"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.681631 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae633ad-6e09-4487-a2ad-21fc13696859-kube-api-access-l8476" (OuterVolumeSpecName: "kube-api-access-l8476") pod "cae633ad-6e09-4487-a2ad-21fc13696859" (UID: "cae633ad-6e09-4487-a2ad-21fc13696859"). InnerVolumeSpecName "kube-api-access-l8476". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.682064 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae633ad-6e09-4487-a2ad-21fc13696859-logs" (OuterVolumeSpecName: "logs") pod "cae633ad-6e09-4487-a2ad-21fc13696859" (UID: "cae633ad-6e09-4487-a2ad-21fc13696859"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.683541 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "69c5b922-a9ee-44a7-a350-966abc4e4809" (UID: "69c5b922-a9ee-44a7-a350-966abc4e4809"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.684855 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c5b922-a9ee-44a7-a350-966abc4e4809-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.684877 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8476\" (UniqueName: \"kubernetes.io/projected/cae633ad-6e09-4487-a2ad-21fc13696859-kube-api-access-l8476\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.684887 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae633ad-6e09-4487-a2ad-21fc13696859-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.686033 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cae633ad-6e09-4487-a2ad-21fc13696859" (UID: "cae633ad-6e09-4487-a2ad-21fc13696859"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.687672 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c5b922-a9ee-44a7-a350-966abc4e4809-kube-api-access-k546x" (OuterVolumeSpecName: "kube-api-access-k546x") pod "69c5b922-a9ee-44a7-a350-966abc4e4809" (UID: "69c5b922-a9ee-44a7-a350-966abc4e4809"). InnerVolumeSpecName "kube-api-access-k546x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.705634 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69c5b922-a9ee-44a7-a350-966abc4e4809" (UID: "69c5b922-a9ee-44a7-a350-966abc4e4809"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.724771 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cae633ad-6e09-4487-a2ad-21fc13696859" (UID: "cae633ad-6e09-4487-a2ad-21fc13696859"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.747697 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data" (OuterVolumeSpecName: "config-data") pod "cae633ad-6e09-4487-a2ad-21fc13696859" (UID: "cae633ad-6e09-4487-a2ad-21fc13696859"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.749852 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data" (OuterVolumeSpecName: "config-data") pod "69c5b922-a9ee-44a7-a350-966abc4e4809" (UID: "69c5b922-a9ee-44a7-a350-966abc4e4809"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.786613 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.786653 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.786665 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k546x\" (UniqueName: \"kubernetes.io/projected/69c5b922-a9ee-44a7-a350-966abc4e4809-kube-api-access-k546x\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.786675 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.786684 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.786691 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69c5b922-a9ee-44a7-a350-966abc4e4809-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.786700 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae633ad-6e09-4487-a2ad-21fc13696859-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.882650 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.942078 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-mtzkb"]
Jan 30 05:26:32 crc kubenswrapper[4841]: I0130 05:26:32.942686 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" podUID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerName="dnsmasq-dns" containerID="cri-o://eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37" gracePeriod=10
Jan 30 05:26:33 crc kubenswrapper[4841]: E0130 05:26:33.104028 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="95528d5c-f34b-4913-9db3-05ef436c106d"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.447793 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.449261 4841 generic.go:334] "Generic (PLEG): container finished" podID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerID="eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37" exitCode=0
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.449304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" event={"ID":"04f2acf0-5409-4a59-b61b-33b657368f0f","Type":"ContainerDied","Data":"eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37"}
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.449325 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb" event={"ID":"04f2acf0-5409-4a59-b61b-33b657368f0f","Type":"ContainerDied","Data":"6f4e1f5ffd28780646c236957b068739329db5aedab6b449f46bd76a138ad367"}
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.449340 4841 scope.go:117] "RemoveContainer" containerID="eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.451217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4467fc7c-fcdx6" event={"ID":"cae633ad-6e09-4487-a2ad-21fc13696859","Type":"ContainerDied","Data":"aa698e62fc79938d10af4d5712e326275eca4bcf08580b96fe9563e13104b152"}
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.451279 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d4467fc7c-fcdx6"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.458554 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerStarted","Data":"36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0"}
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.458690 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="ceilometer-notification-agent" containerID="cri-o://cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8" gracePeriod=30
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.459219 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.459251 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="sg-core" containerID="cri-o://fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6" gracePeriod=30
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.459337 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="proxy-httpd" containerID="cri-o://36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0" gracePeriod=30
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.495458 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt" event={"ID":"69c5b922-a9ee-44a7-a350-966abc4e4809","Type":"ContainerDied","Data":"5c88135e730c48cceb98d83c0995f1027d0023e394d7622377b5dc0cce9cbdc2"}
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.497226 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.506464 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-svc\") pod \"04f2acf0-5409-4a59-b61b-33b657368f0f\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") "
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.506789 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-config\") pod \"04f2acf0-5409-4a59-b61b-33b657368f0f\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") "
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.506932 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-sb\") pod \"04f2acf0-5409-4a59-b61b-33b657368f0f\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") "
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.507069 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnjht\" (UniqueName: \"kubernetes.io/projected/04f2acf0-5409-4a59-b61b-33b657368f0f-kube-api-access-hnjht\") pod \"04f2acf0-5409-4a59-b61b-33b657368f0f\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") "
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.507096 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-swift-storage-0\") pod \"04f2acf0-5409-4a59-b61b-33b657368f0f\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") "
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.507239 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-nb\") pod \"04f2acf0-5409-4a59-b61b-33b657368f0f\" (UID: \"04f2acf0-5409-4a59-b61b-33b657368f0f\") "
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.531552 4841 scope.go:117] "RemoveContainer" containerID="2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.532609 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f2acf0-5409-4a59-b61b-33b657368f0f-kube-api-access-hnjht" (OuterVolumeSpecName: "kube-api-access-hnjht") pod "04f2acf0-5409-4a59-b61b-33b657368f0f" (UID: "04f2acf0-5409-4a59-b61b-33b657368f0f"). InnerVolumeSpecName "kube-api-access-hnjht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.607090 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04f2acf0-5409-4a59-b61b-33b657368f0f" (UID: "04f2acf0-5409-4a59-b61b-33b657368f0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.616861 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-config" (OuterVolumeSpecName: "config") pod "04f2acf0-5409-4a59-b61b-33b657368f0f" (UID: "04f2acf0-5409-4a59-b61b-33b657368f0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.619347 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.619373 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.619381 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnjht\" (UniqueName: \"kubernetes.io/projected/04f2acf0-5409-4a59-b61b-33b657368f0f-kube-api-access-hnjht\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.662461 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7d4467fc7c-fcdx6"]
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.681496 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7d4467fc7c-fcdx6"]
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.684944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04f2acf0-5409-4a59-b61b-33b657368f0f" (UID: "04f2acf0-5409-4a59-b61b-33b657368f0f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.713475 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"]
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.721638 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.735926 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04f2acf0-5409-4a59-b61b-33b657368f0f" (UID: "04f2acf0-5409-4a59-b61b-33b657368f0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.736065 4841 scope.go:117] "RemoveContainer" containerID="eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37"
Jan 30 05:26:33 crc kubenswrapper[4841]: E0130 05:26:33.736799 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37\": container with ID starting with eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37 not found: ID does not exist" containerID="eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.736824 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37"} err="failed to get container status \"eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37\": rpc error: code = NotFound desc = could not find container \"eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37\": container with ID starting with eea8ca043f8dab4532da5ed49e34bbe235eb63c61f77365e38805e11be863b37 not found: ID does not exist"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.736845 4841 scope.go:117] "RemoveContainer" containerID="2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.740352 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-65d7cd9bbd-ltngt"]
Jan 30 05:26:33 crc kubenswrapper[4841]: E0130 05:26:33.745215 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9\": container with ID starting with 2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9 not found: ID does not exist" containerID="2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.745271 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9"} err="failed to get container status \"2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9\": rpc error: code = NotFound desc = could not find container \"2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9\": container with ID starting with 2ba8a83227e53817f9ce555c972d42ea1f8375d6f21b2b439488c05d8c4580a9 not found: ID does not exist"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.745304 4841 scope.go:117] "RemoveContainer" containerID="617639990fdd7a9fc550c4d31f88262c90b61c1d859843dedcd6704d84e2fe6b"
Jan 30 05:26:33 crc kubenswrapper[4841]: E0130 05:26:33.760984 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95528d5c_f34b_4913_9db3_05ef436c106d.slice/crio-fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6.scope\": RecentStats: unable to find data in memory cache]"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.761837 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04f2acf0-5409-4a59-b61b-33b657368f0f" (UID: "04f2acf0-5409-4a59-b61b-33b657368f0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.776393 4841 scope.go:117] "RemoveContainer" containerID="ef99740979d9c7920bf8f0fabf36f408f8c7cef1d02940b3ff12f1aa80e86c55"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.799663 4841 scope.go:117] "RemoveContainer" containerID="e19b87a704474a39dde8c492160b1a7d52c9fc0ec7d9920d7bb3bc793b82d4c6"
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.824627 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.826407 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f2acf0-5409-4a59-b61b-33b657368f0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:33 crc kubenswrapper[4841]: I0130 05:26:33.837545 4841 scope.go:117] "RemoveContainer" containerID="626098a8afa8e25d0fa56ed15e16341bd2dc5528d29cc4461cbcc4e3b8845bca"
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.455819 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" path="/var/lib/kubelet/pods/69c5b922-a9ee-44a7-a350-966abc4e4809/volumes"
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.458234 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" path="/var/lib/kubelet/pods/cae633ad-6e09-4487-a2ad-21fc13696859/volumes"
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.459642 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da063a32-f0f1-4849-b6da-d592237caf41" path="/var/lib/kubelet/pods/da063a32-f0f1-4849-b6da-d592237caf41/volumes"
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.515243 4841 generic.go:334] "Generic (PLEG): container finished" podID="95528d5c-f34b-4913-9db3-05ef436c106d" containerID="36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0" exitCode=0
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.515320 4841 generic.go:334] "Generic (PLEG): container finished" podID="95528d5c-f34b-4913-9db3-05ef436c106d" containerID="fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6" exitCode=2
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.515461 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerDied","Data":"36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0"}
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.515532 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerDied","Data":"fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6"}
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.518105 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7thb" event={"ID":"502d6fe3-4215-4b32-8546-a55e5a4afc91","Type":"ContainerStarted","Data":"142f87d2cb0e9372bbcdecb138aed934cb74505996af8c75485dcbe012426e16"}
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.522954 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-mtzkb"
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.618755 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-t7thb" podStartSLOduration=3.622560124 podStartE2EDuration="47.618736316s" podCreationTimestamp="2026-01-30 05:25:47 +0000 UTC" firstStartedPulling="2026-01-30 05:25:48.790777699 +0000 UTC m=+1085.784250337" lastFinishedPulling="2026-01-30 05:26:32.786953891 +0000 UTC m=+1129.780426529" observedRunningTime="2026-01-30 05:26:34.592614392 +0000 UTC m=+1131.586087070" watchObservedRunningTime="2026-01-30 05:26:34.618736316 +0000 UTC m=+1131.612208964"
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.631865 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-mtzkb"]
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.642009 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-mtzkb"]
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.729662 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:34 crc kubenswrapper[4841]: I0130 05:26:34.909441 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:36 crc kubenswrapper[4841]: I0130 05:26:36.444684 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f2acf0-5409-4a59-b61b-33b657368f0f" path="/var/lib/kubelet/pods/04f2acf0-5409-4a59-b61b-33b657368f0f/volumes"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.204081 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.225747 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d94d6f7cb-lf9nq"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.307450 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77d66cd676-sj9d2"]
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.307679 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77d66cd676-sj9d2" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api-log" containerID="cri-o://8030d99396b89edc5d7d06dcf3d56bab00d6e79110e9c57a635ab6b9a5d1877e" gracePeriod=30
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.307826 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77d66cd676-sj9d2" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api" containerID="cri-o://8074b38b93b4b4437dfa123aa02f0cce562d3c49629666abfcfca788b967095c" gracePeriod=30
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.505961 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.561985 4841 generic.go:334] "Generic (PLEG): container finished" podID="95528d5c-f34b-4913-9db3-05ef436c106d" containerID="cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8" exitCode=0
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.562059 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerDied","Data":"cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8"}
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.562088 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95528d5c-f34b-4913-9db3-05ef436c106d","Type":"ContainerDied","Data":"7a596e5efb750b1d547bfacb64364003ad337bfe0b01d1c0d08932eed1bdf44f"}
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.562059 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.562105 4841 scope.go:117] "RemoveContainer" containerID="36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.564937 4841 generic.go:334] "Generic (PLEG): container finished" podID="b0608cab-e4d7-4288-9ca2-831df372d653" containerID="8030d99396b89edc5d7d06dcf3d56bab00d6e79110e9c57a635ab6b9a5d1877e" exitCode=143
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.565307 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d66cd676-sj9d2" event={"ID":"b0608cab-e4d7-4288-9ca2-831df372d653","Type":"ContainerDied","Data":"8030d99396b89edc5d7d06dcf3d56bab00d6e79110e9c57a635ab6b9a5d1877e"}
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.598494 4841 scope.go:117] "RemoveContainer" containerID="fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.601683 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-config-data\") pod \"95528d5c-f34b-4913-9db3-05ef436c106d\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") "
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.601772 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qltc\" (UniqueName: \"kubernetes.io/projected/95528d5c-f34b-4913-9db3-05ef436c106d-kube-api-access-8qltc\") pod \"95528d5c-f34b-4913-9db3-05ef436c106d\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") "
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.602646 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-combined-ca-bundle\") pod \"95528d5c-f34b-4913-9db3-05ef436c106d\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") "
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.602692 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-run-httpd\") pod \"95528d5c-f34b-4913-9db3-05ef436c106d\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") "
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.603628 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-scripts\") pod \"95528d5c-f34b-4913-9db3-05ef436c106d\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") "
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.603687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-log-httpd\") pod \"95528d5c-f34b-4913-9db3-05ef436c106d\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") "
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.603725 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-sg-core-conf-yaml\") pod \"95528d5c-f34b-4913-9db3-05ef436c106d\" (UID: \"95528d5c-f34b-4913-9db3-05ef436c106d\") "
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.604039 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95528d5c-f34b-4913-9db3-05ef436c106d" (UID: "95528d5c-f34b-4913-9db3-05ef436c106d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.604558 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.604740 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95528d5c-f34b-4913-9db3-05ef436c106d" (UID: "95528d5c-f34b-4913-9db3-05ef436c106d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.607215 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95528d5c-f34b-4913-9db3-05ef436c106d-kube-api-access-8qltc" (OuterVolumeSpecName: "kube-api-access-8qltc") pod "95528d5c-f34b-4913-9db3-05ef436c106d" (UID: "95528d5c-f34b-4913-9db3-05ef436c106d"). InnerVolumeSpecName "kube-api-access-8qltc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.625343 4841 scope.go:117] "RemoveContainer" containerID="cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8"
Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.632096 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-scripts" (OuterVolumeSpecName: "scripts") pod "95528d5c-f34b-4913-9db3-05ef436c106d" (UID: "95528d5c-f34b-4913-9db3-05ef436c106d"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.642068 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95528d5c-f34b-4913-9db3-05ef436c106d" (UID: "95528d5c-f34b-4913-9db3-05ef436c106d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.652040 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95528d5c-f34b-4913-9db3-05ef436c106d" (UID: "95528d5c-f34b-4913-9db3-05ef436c106d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.679097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-config-data" (OuterVolumeSpecName: "config-data") pod "95528d5c-f34b-4913-9db3-05ef436c106d" (UID: "95528d5c-f34b-4913-9db3-05ef436c106d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.706264 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.706297 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qltc\" (UniqueName: \"kubernetes.io/projected/95528d5c-f34b-4913-9db3-05ef436c106d-kube-api-access-8qltc\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.706308 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.706317 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.706324 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95528d5c-f34b-4913-9db3-05ef436c106d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.706332 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95528d5c-f34b-4913-9db3-05ef436c106d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.706707 4841 scope.go:117] "RemoveContainer" containerID="36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0" Jan 30 05:26:37 crc kubenswrapper[4841]: E0130 05:26:37.708712 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0\": container with ID starting with 36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0 not found: ID does not exist" containerID="36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.708759 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0"} err="failed to get container status \"36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0\": rpc error: code = NotFound desc = could not find container \"36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0\": container with ID starting with 36571f9e9e313c91814026e584e6e4a4a6214ba89eb457fc48002d4dd1936bc0 not found: ID does not exist" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.708803 4841 scope.go:117] "RemoveContainer" containerID="fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6" Jan 30 05:26:37 crc kubenswrapper[4841]: E0130 05:26:37.709230 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6\": container with ID starting with fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6 not found: ID does not exist" containerID="fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.709254 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6"} err="failed to get container status \"fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6\": rpc error: code = NotFound desc = could not find container \"fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6\": 
container with ID starting with fe6f3e46a58fdabf314c06ad12bc601ca92fa7f4fa03088e26306e8c3f7d1aa6 not found: ID does not exist" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.709291 4841 scope.go:117] "RemoveContainer" containerID="cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8" Jan 30 05:26:37 crc kubenswrapper[4841]: E0130 05:26:37.714604 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8\": container with ID starting with cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8 not found: ID does not exist" containerID="cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.714656 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8"} err="failed to get container status \"cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8\": rpc error: code = NotFound desc = could not find container \"cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8\": container with ID starting with cde340bea26d741efff02c11464b1bcc6e5280c76ebff24f132b1e28a39204a8 not found: ID does not exist" Jan 30 05:26:37 crc kubenswrapper[4841]: I0130 05:26:37.985476 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.000283 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.017500 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.017883 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" 
containerName="proxy-httpd" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.017900 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="proxy-httpd" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.017913 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerName="dnsmasq-dns" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.017920 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerName="dnsmasq-dns" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.017931 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker-log" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.017938 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker-log" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.017946 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener-log" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.017952 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener-log" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.017966 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="ceilometer-notification-agent" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.017972 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="ceilometer-notification-agent" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.017980 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerName="init" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.017986 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerName="init" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.017995 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018001 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.018026 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="sg-core" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018040 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="sg-core" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.018049 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api-log" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018055 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api-log" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.018067 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018073 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener" Jan 30 05:26:38 crc kubenswrapper[4841]: E0130 05:26:38.018083 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018089 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018265 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018273 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="proxy-httpd" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018284 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018299 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="sg-core" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018308 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" containerName="ceilometer-notification-agent" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018318 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="da063a32-f0f1-4849-b6da-d592237caf41" containerName="barbican-api-log" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018326 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener-log" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018334 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c5b922-a9ee-44a7-a350-966abc4e4809" containerName="barbican-keystone-listener" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018344 4841 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cae633ad-6e09-4487-a2ad-21fc13696859" containerName="barbican-worker-log" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.018351 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f2acf0-5409-4a59-b61b-33b657368f0f" containerName="dnsmasq-dns" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.019940 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.022079 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.022249 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.040404 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.114100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lbh\" (UniqueName: \"kubernetes.io/projected/29253fdb-c316-4287-b2e4-0e5d129bfed5-kube-api-access-99lbh\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.114602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.114818 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-log-httpd\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.114961 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.115070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-config-data\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.115209 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-run-httpd\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.115355 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-scripts\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.216811 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-log-httpd\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 
30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.216878 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.216909 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-config-data\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.216944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-run-httpd\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.216967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-scripts\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.217018 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lbh\" (UniqueName: \"kubernetes.io/projected/29253fdb-c316-4287-b2e4-0e5d129bfed5-kube-api-access-99lbh\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.217036 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.217653 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-log-httpd\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.217855 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-run-httpd\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.222193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-scripts\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.222898 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.223899 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.225241 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-config-data\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.233377 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lbh\" (UniqueName: \"kubernetes.io/projected/29253fdb-c316-4287-b2e4-0e5d129bfed5-kube-api-access-99lbh\") pod \"ceilometer-0\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.332441 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.447988 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95528d5c-f34b-4913-9db3-05ef436c106d" path="/var/lib/kubelet/pods/95528d5c-f34b-4913-9db3-05ef436c106d/volumes" Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.576752 4841 generic.go:334] "Generic (PLEG): container finished" podID="502d6fe3-4215-4b32-8546-a55e5a4afc91" containerID="142f87d2cb0e9372bbcdecb138aed934cb74505996af8c75485dcbe012426e16" exitCode=0 Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.576803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7thb" event={"ID":"502d6fe3-4215-4b32-8546-a55e5a4afc91","Type":"ContainerDied","Data":"142f87d2cb0e9372bbcdecb138aed934cb74505996af8c75485dcbe012426e16"} Jan 30 05:26:38 crc kubenswrapper[4841]: I0130 05:26:38.873008 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:26:38 crc kubenswrapper[4841]: W0130 05:26:38.882798 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29253fdb_c316_4287_b2e4_0e5d129bfed5.slice/crio-3f8ad2455011d649cfcdcdbfa46334bebb54f7372acdeec67f025608cc7f3c11 WatchSource:0}: Error finding container 3f8ad2455011d649cfcdcdbfa46334bebb54f7372acdeec67f025608cc7f3c11: Status 404 returned error can't find the container with id 3f8ad2455011d649cfcdcdbfa46334bebb54f7372acdeec67f025608cc7f3c11 Jan 30 05:26:39 crc kubenswrapper[4841]: I0130 05:26:39.588295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerStarted","Data":"3f8ad2455011d649cfcdcdbfa46334bebb54f7372acdeec67f025608cc7f3c11"} Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.130936 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7thb" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.166521 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-scripts\") pod \"502d6fe3-4215-4b32-8546-a55e5a4afc91\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.166736 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/502d6fe3-4215-4b32-8546-a55e5a4afc91-etc-machine-id\") pod \"502d6fe3-4215-4b32-8546-a55e5a4afc91\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.166792 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-db-sync-config-data\") pod \"502d6fe3-4215-4b32-8546-a55e5a4afc91\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 
05:26:40.166941 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-combined-ca-bundle\") pod \"502d6fe3-4215-4b32-8546-a55e5a4afc91\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.166981 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-config-data\") pod \"502d6fe3-4215-4b32-8546-a55e5a4afc91\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.167022 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj5wd\" (UniqueName: \"kubernetes.io/projected/502d6fe3-4215-4b32-8546-a55e5a4afc91-kube-api-access-pj5wd\") pod \"502d6fe3-4215-4b32-8546-a55e5a4afc91\" (UID: \"502d6fe3-4215-4b32-8546-a55e5a4afc91\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.168548 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/502d6fe3-4215-4b32-8546-a55e5a4afc91-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "502d6fe3-4215-4b32-8546-a55e5a4afc91" (UID: "502d6fe3-4215-4b32-8546-a55e5a4afc91"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.173075 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502d6fe3-4215-4b32-8546-a55e5a4afc91-kube-api-access-pj5wd" (OuterVolumeSpecName: "kube-api-access-pj5wd") pod "502d6fe3-4215-4b32-8546-a55e5a4afc91" (UID: "502d6fe3-4215-4b32-8546-a55e5a4afc91"). InnerVolumeSpecName "kube-api-access-pj5wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.173665 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-scripts" (OuterVolumeSpecName: "scripts") pod "502d6fe3-4215-4b32-8546-a55e5a4afc91" (UID: "502d6fe3-4215-4b32-8546-a55e5a4afc91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.173816 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "502d6fe3-4215-4b32-8546-a55e5a4afc91" (UID: "502d6fe3-4215-4b32-8546-a55e5a4afc91"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.203652 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "502d6fe3-4215-4b32-8546-a55e5a4afc91" (UID: "502d6fe3-4215-4b32-8546-a55e5a4afc91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.215720 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-config-data" (OuterVolumeSpecName: "config-data") pod "502d6fe3-4215-4b32-8546-a55e5a4afc91" (UID: "502d6fe3-4215-4b32-8546-a55e5a4afc91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.269530 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.269558 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/502d6fe3-4215-4b32-8546-a55e5a4afc91-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.269569 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.269578 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.269585 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d6fe3-4215-4b32-8546-a55e5a4afc91-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.269594 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj5wd\" (UniqueName: \"kubernetes.io/projected/502d6fe3-4215-4b32-8546-a55e5a4afc91-kube-api-access-pj5wd\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.464886 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.464999 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.465093 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.466481 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"344e964639b293843f58c03939c43edd9bcd822c2de642650f9f33c6e2a4eb20"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.466619 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://344e964639b293843f58c03939c43edd9bcd822c2de642650f9f33c6e2a4eb20" gracePeriod=600 Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.485681 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77d66cd676-sj9d2" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:40130->10.217.0.164:9311: read: connection reset by peer" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.485754 4841 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-77d66cd676-sj9d2" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:40126->10.217.0.164:9311: read: connection reset by peer" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.604811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7thb" event={"ID":"502d6fe3-4215-4b32-8546-a55e5a4afc91","Type":"ContainerDied","Data":"bbea313c627f342f17d623bf1060c6258e35bc29916fdb9b73500546ef1265bc"} Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.605067 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbea313c627f342f17d623bf1060c6258e35bc29916fdb9b73500546ef1265bc" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.604940 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7thb" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.607977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerStarted","Data":"8ff43360aaa71073b396349def6db95a5090b4128841c76b4eb8f7a4aa9fc054"} Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.614585 4841 generic.go:334] "Generic (PLEG): container finished" podID="b0608cab-e4d7-4288-9ca2-831df372d653" containerID="8074b38b93b4b4437dfa123aa02f0cce562d3c49629666abfcfca788b967095c" exitCode=0 Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.614639 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d66cd676-sj9d2" event={"ID":"b0608cab-e4d7-4288-9ca2-831df372d653","Type":"ContainerDied","Data":"8074b38b93b4b4437dfa123aa02f0cce562d3c49629666abfcfca788b967095c"} Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.896218 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77d66cd676-sj9d2" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.917796 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:40 crc kubenswrapper[4841]: E0130 05:26:40.918156 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502d6fe3-4215-4b32-8546-a55e5a4afc91" containerName="cinder-db-sync" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.918172 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="502d6fe3-4215-4b32-8546-a55e5a4afc91" containerName="cinder-db-sync" Jan 30 05:26:40 crc kubenswrapper[4841]: E0130 05:26:40.918196 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api-log" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.918203 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api-log" Jan 30 05:26:40 crc kubenswrapper[4841]: E0130 05:26:40.918216 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.918222 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.918804 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="502d6fe3-4215-4b32-8546-a55e5a4afc91" containerName="cinder-db-sync" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.918828 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" containerName="barbican-api" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.918841 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" 
containerName="barbican-api-log" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.919699 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.923058 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s5wxv" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.923061 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.923199 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.923346 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.935082 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.979021 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-c2tx2"] Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.980723 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.989149 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0608cab-e4d7-4288-9ca2-831df372d653-logs\") pod \"b0608cab-e4d7-4288-9ca2-831df372d653\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.989259 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data-custom\") pod \"b0608cab-e4d7-4288-9ca2-831df372d653\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.989300 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data\") pod \"b0608cab-e4d7-4288-9ca2-831df372d653\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.989325 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-combined-ca-bundle\") pod \"b0608cab-e4d7-4288-9ca2-831df372d653\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.989391 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj5p8\" (UniqueName: \"kubernetes.io/projected/b0608cab-e4d7-4288-9ca2-831df372d653-kube-api-access-jj5p8\") pod \"b0608cab-e4d7-4288-9ca2-831df372d653\" (UID: \"b0608cab-e4d7-4288-9ca2-831df372d653\") " Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.991508 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b0608cab-e4d7-4288-9ca2-831df372d653-logs" (OuterVolumeSpecName: "logs") pod "b0608cab-e4d7-4288-9ca2-831df372d653" (UID: "b0608cab-e4d7-4288-9ca2-831df372d653"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.992236 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b806b168-492c-451d-8e35-f550c4f690ed-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.992388 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.992637 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kdgm\" (UniqueName: \"kubernetes.io/projected/b806b168-492c-451d-8e35-f550c4f690ed-kube-api-access-7kdgm\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.992788 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.992885 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.993477 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-scripts\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:40 crc kubenswrapper[4841]: I0130 05:26:40.993611 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0608cab-e4d7-4288-9ca2-831df372d653-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.005124 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-c2tx2"] Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.022926 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0608cab-e4d7-4288-9ca2-831df372d653-kube-api-access-jj5p8" (OuterVolumeSpecName: "kube-api-access-jj5p8") pod "b0608cab-e4d7-4288-9ca2-831df372d653" (UID: "b0608cab-e4d7-4288-9ca2-831df372d653"). InnerVolumeSpecName "kube-api-access-jj5p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.025942 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0608cab-e4d7-4288-9ca2-831df372d653" (UID: "b0608cab-e4d7-4288-9ca2-831df372d653"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.026351 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b0608cab-e4d7-4288-9ca2-831df372d653" (UID: "b0608cab-e4d7-4288-9ca2-831df372d653"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.086641 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data" (OuterVolumeSpecName: "config-data") pod "b0608cab-e4d7-4288-9ca2-831df372d653" (UID: "b0608cab-e4d7-4288-9ca2-831df372d653"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.098479 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.098547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.098778 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.098881 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.099031 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.099124 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-config\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.099327 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-scripts\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.099591 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: 
\"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.100063 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b806b168-492c-451d-8e35-f550c4f690ed-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.100293 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs75w\" (UniqueName: \"kubernetes.io/projected/ccfedfd0-1320-452d-b99e-5941a9601014-kube-api-access-bs75w\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.100505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.100952 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kdgm\" (UniqueName: \"kubernetes.io/projected/b806b168-492c-451d-8e35-f550c4f690ed-kube-api-access-7kdgm\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.101119 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.101209 4841 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.101268 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0608cab-e4d7-4288-9ca2-831df372d653-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.101362 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj5p8\" (UniqueName: \"kubernetes.io/projected/b0608cab-e4d7-4288-9ca2-831df372d653-kube-api-access-jj5p8\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.100206 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b806b168-492c-451d-8e35-f550c4f690ed-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.102100 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.103405 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.106007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-scripts\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.112685 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.115575 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdgm\" (UniqueName: \"kubernetes.io/projected/b806b168-492c-451d-8e35-f550c4f690ed-kube-api-access-7kdgm\") pod \"cinder-scheduler-0\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.155806 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.157571 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.160040 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.172893 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205302 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-scripts\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205340 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212dccf2-8d10-47ad-acf5-3df6a404b15f-logs\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205380 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/212dccf2-8d10-47ad-acf5-3df6a404b15f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205454 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205496 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bs75w\" (UniqueName: \"kubernetes.io/projected/ccfedfd0-1320-452d-b99e-5941a9601014-kube-api-access-bs75w\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205634 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data-custom\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205761 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205840 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlsnk\" (UniqueName: \"kubernetes.io/projected/212dccf2-8d10-47ad-acf5-3df6a404b15f-kube-api-access-xlsnk\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205876 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.205984 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.206008 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-config\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.206051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.206373 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.206523 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.206952 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.208139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-config\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.209465 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.227132 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs75w\" (UniqueName: \"kubernetes.io/projected/ccfedfd0-1320-452d-b99e-5941a9601014-kube-api-access-bs75w\") pod \"dnsmasq-dns-75dbb546bf-c2tx2\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") " pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.247099 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.310854 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.311749 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data-custom\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.311852 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlsnk\" (UniqueName: \"kubernetes.io/projected/212dccf2-8d10-47ad-acf5-3df6a404b15f-kube-api-access-xlsnk\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.311897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.311939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.311982 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-scripts\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.311995 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212dccf2-8d10-47ad-acf5-3df6a404b15f-logs\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.312039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/212dccf2-8d10-47ad-acf5-3df6a404b15f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.312178 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/212dccf2-8d10-47ad-acf5-3df6a404b15f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.314436 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212dccf2-8d10-47ad-acf5-3df6a404b15f-logs\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.314933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data-custom\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.321447 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.322031 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-scripts\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.328213 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.328629 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlsnk\" (UniqueName: \"kubernetes.io/projected/212dccf2-8d10-47ad-acf5-3df6a404b15f-kube-api-access-xlsnk\") pod \"cinder-api-0\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") " pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.564535 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.651914 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="344e964639b293843f58c03939c43edd9bcd822c2de642650f9f33c6e2a4eb20" exitCode=0
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.652216 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"344e964639b293843f58c03939c43edd9bcd822c2de642650f9f33c6e2a4eb20"}
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.652244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"8bf8b4cf30e4dca128fb9c700ac455cfd4ad66705f2809226e958756cecd6fcb"}
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.652264 4841 scope.go:117] "RemoveContainer" containerID="6383000b3c9a91b0197a8dcc28ebef33bc3b14f4f0dc8fac3dcfc3c8dcddb775"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.656722 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerStarted","Data":"6991ac0d0f5787ebee296eaa5f3ad0cd646e494f4dcbc1bfe860414036eea26a"}
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.656755 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerStarted","Data":"6684dd85866b7d240e58a458411946df2ce04bbc98a59876a1263d23cc34dd60"}
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.670752 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77d66cd676-sj9d2" event={"ID":"b0608cab-e4d7-4288-9ca2-831df372d653","Type":"ContainerDied","Data":"068f0557b6f2402d00a6e002663214e793ed1601b0687c1852a732b8e5efb417"}
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.670847 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77d66cd676-sj9d2"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.714502 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77d66cd676-sj9d2"]
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.716612 4841 scope.go:117] "RemoveContainer" containerID="8074b38b93b4b4437dfa123aa02f0cce562d3c49629666abfcfca788b967095c"
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.721506 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-77d66cd676-sj9d2"]
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.746497 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.758896 4841 scope.go:117] "RemoveContainer" containerID="8030d99396b89edc5d7d06dcf3d56bab00d6e79110e9c57a635ab6b9a5d1877e"
Jan 30 05:26:41 crc kubenswrapper[4841]: W0130 05:26:41.768951 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb806b168_492c_451d_8e35_f550c4f690ed.slice/crio-1fb8c240d9f8fcd64531009da82dfee80abade1be7e540a31a4dd029397b2618 WatchSource:0}: Error finding container 1fb8c240d9f8fcd64531009da82dfee80abade1be7e540a31a4dd029397b2618: Status 404 returned error can't find the container with id 1fb8c240d9f8fcd64531009da82dfee80abade1be7e540a31a4dd029397b2618
Jan 30 05:26:41 crc kubenswrapper[4841]: I0130 05:26:41.825364 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-c2tx2"]
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.063129 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.442808 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0608cab-e4d7-4288-9ca2-831df372d653" path="/var/lib/kubelet/pods/b0608cab-e4d7-4288-9ca2-831df372d653/volumes"
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.691677 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"212dccf2-8d10-47ad-acf5-3df6a404b15f","Type":"ContainerStarted","Data":"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d"}
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.691849 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"212dccf2-8d10-47ad-acf5-3df6a404b15f","Type":"ContainerStarted","Data":"e69df113aa8f5a10c3a259f74e9ea7ef64aa8760edf981bd028e6d84b6ecda13"}
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.703781 4841 generic.go:334] "Generic (PLEG): container finished" podID="ccfedfd0-1320-452d-b99e-5941a9601014" containerID="290b306f10c71213e52e22feccb083a2e99d29e7890b6f8bfa1024b4ea5849b5" exitCode=0
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.703854 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" event={"ID":"ccfedfd0-1320-452d-b99e-5941a9601014","Type":"ContainerDied","Data":"290b306f10c71213e52e22feccb083a2e99d29e7890b6f8bfa1024b4ea5849b5"}
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.703878 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" event={"ID":"ccfedfd0-1320-452d-b99e-5941a9601014","Type":"ContainerStarted","Data":"4ad9ddcece4b7f4422f84b83fa5cb5c0af6b0640242305c0cdf1e7f49f3eded1"}
Jan 30 05:26:42 crc kubenswrapper[4841]: I0130 05:26:42.707711 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b806b168-492c-451d-8e35-f550c4f690ed","Type":"ContainerStarted","Data":"1fb8c240d9f8fcd64531009da82dfee80abade1be7e540a31a4dd029397b2618"}
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.725571 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" event={"ID":"ccfedfd0-1320-452d-b99e-5941a9601014","Type":"ContainerStarted","Data":"d070949f85ad9838c18c65ae2129ba936758deaf2d70aa746df18e3aa5a1e644"}
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.725984 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.729013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b806b168-492c-451d-8e35-f550c4f690ed","Type":"ContainerStarted","Data":"6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392"}
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.734985 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"212dccf2-8d10-47ad-acf5-3df6a404b15f","Type":"ContainerStarted","Data":"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9"}
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.735143 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.754768 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" podStartSLOduration=3.754743252 podStartE2EDuration="3.754743252s" podCreationTimestamp="2026-01-30 05:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:43.747193105 +0000 UTC m=+1140.740665743" watchObservedRunningTime="2026-01-30 05:26:43.754743252 +0000 UTC m=+1140.748215890"
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.769164 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.769149229 podStartE2EDuration="2.769149229s" podCreationTimestamp="2026-01-30 05:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:43.767365963 +0000 UTC m=+1140.760838601" watchObservedRunningTime="2026-01-30 05:26:43.769149229 +0000 UTC m=+1140.762621867"
Jan 30 05:26:43 crc kubenswrapper[4841]: I0130 05:26:43.984022 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:26:44 crc kubenswrapper[4841]: I0130 05:26:44.744007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerStarted","Data":"e586caec9c795381265e025d82640e7144ebc4152d95c80e46a5de9fea6a4eb1"}
Jan 30 05:26:44 crc kubenswrapper[4841]: I0130 05:26:44.745285 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 05:26:44 crc kubenswrapper[4841]: I0130 05:26:44.747077 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b806b168-492c-451d-8e35-f550c4f690ed","Type":"ContainerStarted","Data":"31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c"}
Jan 30 05:26:44 crc kubenswrapper[4841]: I0130 05:26:44.765936 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.066048485 podStartE2EDuration="7.765924049s" podCreationTimestamp="2026-01-30 05:26:37 +0000 UTC" firstStartedPulling="2026-01-30 05:26:38.885835434 +0000 UTC m=+1135.879308112" lastFinishedPulling="2026-01-30 05:26:43.585711038 +0000 UTC m=+1140.579183676" observedRunningTime="2026-01-30 05:26:44.759915292 +0000 UTC m=+1141.753387930" watchObservedRunningTime="2026-01-30 05:26:44.765924049 +0000 UTC m=+1141.759396687"
Jan 30 05:26:44 crc kubenswrapper[4841]: I0130 05:26:44.788260 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.184220753 podStartE2EDuration="4.788243253s" podCreationTimestamp="2026-01-30 05:26:40 +0000 UTC" firstStartedPulling="2026-01-30 05:26:41.773765022 +0000 UTC m=+1138.767237660" lastFinishedPulling="2026-01-30 05:26:42.377787522 +0000 UTC m=+1139.371260160" observedRunningTime="2026-01-30 05:26:44.780724967 +0000 UTC m=+1141.774197615" watchObservedRunningTime="2026-01-30 05:26:44.788243253 +0000 UTC m=+1141.781715891"
Jan 30 05:26:45 crc kubenswrapper[4841]: I0130 05:26:45.639275 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f5986d768-c66l5"
Jan 30 05:26:45 crc kubenswrapper[4841]: I0130 05:26:45.757237 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api-log" containerID="cri-o://b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d" gracePeriod=30
Jan 30 05:26:45 crc kubenswrapper[4841]: I0130 05:26:45.758511 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api" containerID="cri-o://6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9" gracePeriod=30
Jan 30 05:26:45 crc kubenswrapper[4841]: I0130 05:26:45.989600 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bcfb9ffb5-q4t7z"]
Jan 30 05:26:45 crc kubenswrapper[4841]: I0130 05:26:45.989868 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bcfb9ffb5-q4t7z" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-api" containerID="cri-o://2b1a0e1c3a5b1bc5a2cea93c3da88301a3042badaf0a1dbc97c75b42d3595c7d" gracePeriod=30
Jan 30 05:26:45 crc kubenswrapper[4841]: I0130 05:26:45.989986 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bcfb9ffb5-q4t7z" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-httpd" containerID="cri-o://6b1c61cf113a9d02dc8e84edb3cccdb64ff41343455a288f83fa9a65400016f9" gracePeriod=30
Jan 30 05:26:45 crc kubenswrapper[4841]: I0130 05:26:45.998810 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6bcfb9ffb5-q4t7z" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": EOF"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.026276 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-65977b5879-qctf6"]
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.027744 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.047621 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65977b5879-qctf6"]
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.103232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc5nf\" (UniqueName: \"kubernetes.io/projected/8ad9e30b-abf9-45fd-9088-103c94e4ed70-kube-api-access-cc5nf\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.103277 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-ovndb-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.103327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-public-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.103350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-internal-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.103387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-config\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.103425 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-httpd-config\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.103523 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-combined-ca-bundle\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.204958 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc5nf\" (UniqueName: \"kubernetes.io/projected/8ad9e30b-abf9-45fd-9088-103c94e4ed70-kube-api-access-cc5nf\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.205026 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-ovndb-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.205104 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-public-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.205146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-internal-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.205206 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-config\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.205241 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-httpd-config\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.205311 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-combined-ca-bundle\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.211842 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-public-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.226164 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-combined-ca-bundle\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.226174 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-internal-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.226416 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-ovndb-tls-certs\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.226753 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-httpd-config\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.229307 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc5nf\" (UniqueName: \"kubernetes.io/projected/8ad9e30b-abf9-45fd-9088-103c94e4ed70-kube-api-access-cc5nf\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.234490 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-config\") pod \"neutron-65977b5879-qctf6\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.248003 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.339865 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.347880 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-65977b5879-qctf6"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.409377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data\") pod \"212dccf2-8d10-47ad-acf5-3df6a404b15f\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") "
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.409558 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-combined-ca-bundle\") pod \"212dccf2-8d10-47ad-acf5-3df6a404b15f\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") "
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.409710 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlsnk\" (UniqueName: \"kubernetes.io/projected/212dccf2-8d10-47ad-acf5-3df6a404b15f-kube-api-access-xlsnk\") pod \"212dccf2-8d10-47ad-acf5-3df6a404b15f\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") "
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.409767 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-scripts\") pod \"212dccf2-8d10-47ad-acf5-3df6a404b15f\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") "
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.409794 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data-custom\") pod \"212dccf2-8d10-47ad-acf5-3df6a404b15f\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") "
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.411576 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/212dccf2-8d10-47ad-acf5-3df6a404b15f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "212dccf2-8d10-47ad-acf5-3df6a404b15f" (UID: "212dccf2-8d10-47ad-acf5-3df6a404b15f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.413253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/212dccf2-8d10-47ad-acf5-3df6a404b15f-etc-machine-id\") pod \"212dccf2-8d10-47ad-acf5-3df6a404b15f\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") "
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.413437 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212dccf2-8d10-47ad-acf5-3df6a404b15f-logs\") pod \"212dccf2-8d10-47ad-acf5-3df6a404b15f\" (UID: \"212dccf2-8d10-47ad-acf5-3df6a404b15f\") "
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.414044 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212dccf2-8d10-47ad-acf5-3df6a404b15f-logs" (OuterVolumeSpecName: "logs") pod "212dccf2-8d10-47ad-acf5-3df6a404b15f" (UID: "212dccf2-8d10-47ad-acf5-3df6a404b15f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.414505 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/212dccf2-8d10-47ad-acf5-3df6a404b15f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.414525 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212dccf2-8d10-47ad-acf5-3df6a404b15f-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.417470 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-scripts" (OuterVolumeSpecName: "scripts") pod "212dccf2-8d10-47ad-acf5-3df6a404b15f" (UID: "212dccf2-8d10-47ad-acf5-3df6a404b15f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.417877 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/212dccf2-8d10-47ad-acf5-3df6a404b15f-kube-api-access-xlsnk" (OuterVolumeSpecName: "kube-api-access-xlsnk") pod "212dccf2-8d10-47ad-acf5-3df6a404b15f" (UID: "212dccf2-8d10-47ad-acf5-3df6a404b15f"). InnerVolumeSpecName "kube-api-access-xlsnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.427694 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "212dccf2-8d10-47ad-acf5-3df6a404b15f" (UID: "212dccf2-8d10-47ad-acf5-3df6a404b15f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.455593 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "212dccf2-8d10-47ad-acf5-3df6a404b15f" (UID: "212dccf2-8d10-47ad-acf5-3df6a404b15f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.474649 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data" (OuterVolumeSpecName: "config-data") pod "212dccf2-8d10-47ad-acf5-3df6a404b15f" (UID: "212dccf2-8d10-47ad-acf5-3df6a404b15f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.516055 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlsnk\" (UniqueName: \"kubernetes.io/projected/212dccf2-8d10-47ad-acf5-3df6a404b15f-kube-api-access-xlsnk\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.516089 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.516101 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.516110 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.516118 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212dccf2-8d10-47ad-acf5-3df6a404b15f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.778455 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerID="6b1c61cf113a9d02dc8e84edb3cccdb64ff41343455a288f83fa9a65400016f9" exitCode=0
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.778627 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bcfb9ffb5-q4t7z" event={"ID":"fc4310f7-3d6f-45dc-a17e-5311da837e81","Type":"ContainerDied","Data":"6b1c61cf113a9d02dc8e84edb3cccdb64ff41343455a288f83fa9a65400016f9"}
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.785781 4841 generic.go:334] "Generic (PLEG): container finished" podID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerID="6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9" exitCode=0
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.785814 4841 generic.go:334] "Generic (PLEG): container finished" podID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerID="b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d" exitCode=143
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.785933 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.786030 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"212dccf2-8d10-47ad-acf5-3df6a404b15f","Type":"ContainerDied","Data":"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9"}
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.786249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"212dccf2-8d10-47ad-acf5-3df6a404b15f","Type":"ContainerDied","Data":"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d"}
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.786273 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"212dccf2-8d10-47ad-acf5-3df6a404b15f","Type":"ContainerDied","Data":"e69df113aa8f5a10c3a259f74e9ea7ef64aa8760edf981bd028e6d84b6ecda13"}
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.786300 4841 scope.go:117] "RemoveContainer" containerID="6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.830310 4841 scope.go:117] "RemoveContainer" containerID="b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.843530 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.860926 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.873224 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:26:46 crc kubenswrapper[4841]: E0130 05:26:46.873644 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api"
Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 
05:26:46.873655 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api" Jan 30 05:26:46 crc kubenswrapper[4841]: E0130 05:26:46.873681 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api-log" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.873689 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api-log" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.873898 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api-log" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.873913 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" containerName="cinder-api" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.874848 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.882294 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.883546 4841 scope.go:117] "RemoveContainer" containerID="6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.883968 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.884199 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.884342 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 30 05:26:46 crc kubenswrapper[4841]: E0130 05:26:46.886907 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9\": container with ID starting with 6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9 not found: ID does not exist" containerID="6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.887186 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9"} err="failed to get container status \"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9\": rpc error: code = NotFound desc = could not find container \"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9\": container with ID starting with 6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9 not found: ID does not exist" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 
05:26:46.887213 4841 scope.go:117] "RemoveContainer" containerID="b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d" Jan 30 05:26:46 crc kubenswrapper[4841]: E0130 05:26:46.887522 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d\": container with ID starting with b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d not found: ID does not exist" containerID="b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.887561 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d"} err="failed to get container status \"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d\": rpc error: code = NotFound desc = could not find container \"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d\": container with ID starting with b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d not found: ID does not exist" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.887584 4841 scope.go:117] "RemoveContainer" containerID="6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.887805 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9"} err="failed to get container status \"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9\": rpc error: code = NotFound desc = could not find container \"6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9\": container with ID starting with 6f15ae02a146e5e3384bf1d22948535731994dacc56c2351797c929f5e8edff9 not found: ID does not exist" Jan 30 05:26:46 crc 
kubenswrapper[4841]: I0130 05:26:46.887822 4841 scope.go:117] "RemoveContainer" containerID="b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.887975 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d"} err="failed to get container status \"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d\": rpc error: code = NotFound desc = could not find container \"b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d\": container with ID starting with b1e37bd4d8b336a6d3e10ea399f75e17ae679724dbf3c67a972b2e5ee1b8d63d not found: ID does not exist" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923001 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-scripts\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923082 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923110 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923170 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a2724da-6b9b-4947-a4e3-894938742304-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923210 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923225 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923242 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2724da-6b9b-4947-a4e3-894938742304-logs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.923262 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9g78\" (UniqueName: 
\"kubernetes.io/projected/1a2724da-6b9b-4947-a4e3-894938742304-kube-api-access-z9g78\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:46 crc kubenswrapper[4841]: W0130 05:26:46.944882 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ad9e30b_abf9_45fd_9088_103c94e4ed70.slice/crio-707af2a63bdd05af37293d4e7a98f5b96492243fd1c69d8b55b8fce62b1f6704 WatchSource:0}: Error finding container 707af2a63bdd05af37293d4e7a98f5b96492243fd1c69d8b55b8fce62b1f6704: Status 404 returned error can't find the container with id 707af2a63bdd05af37293d4e7a98f5b96492243fd1c69d8b55b8fce62b1f6704 Jan 30 05:26:46 crc kubenswrapper[4841]: I0130 05:26:46.948136 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65977b5879-qctf6"] Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025345 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025449 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025554 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a2724da-6b9b-4947-a4e3-894938742304-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc 
kubenswrapper[4841]: I0130 05:26:47.025613 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025641 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2724da-6b9b-4947-a4e3-894938742304-logs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025693 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9g78\" (UniqueName: \"kubernetes.io/projected/1a2724da-6b9b-4947-a4e3-894938742304-kube-api-access-z9g78\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-scripts\") pod 
\"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.025888 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a2724da-6b9b-4947-a4e3-894938742304-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.026525 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2724da-6b9b-4947-a4e3-894938742304-logs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.028775 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.029597 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-scripts\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.029785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.030359 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.030888 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.032353 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.052944 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9g78\" (UniqueName: \"kubernetes.io/projected/1a2724da-6b9b-4947-a4e3-894938742304-kube-api-access-z9g78\") pod \"cinder-api-0\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.114311 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.633563 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:26:47 crc kubenswrapper[4841]: W0130 05:26:47.638644 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a2724da_6b9b_4947_a4e3_894938742304.slice/crio-36f7afdbe5084d5625b8087adb504fde099a7eba590244560f3e9ec87dbbbff8 WatchSource:0}: Error finding container 36f7afdbe5084d5625b8087adb504fde099a7eba590244560f3e9ec87dbbbff8: Status 404 returned error can't find the container with id 36f7afdbe5084d5625b8087adb504fde099a7eba590244560f3e9ec87dbbbff8 Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.798650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a2724da-6b9b-4947-a4e3-894938742304","Type":"ContainerStarted","Data":"36f7afdbe5084d5625b8087adb504fde099a7eba590244560f3e9ec87dbbbff8"} Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.800630 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65977b5879-qctf6" event={"ID":"8ad9e30b-abf9-45fd-9088-103c94e4ed70","Type":"ContainerStarted","Data":"71a6a5266aac4658f33e51c0327009b65b98cc2a8a908dc821c307eb9aa11b89"} Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.800667 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65977b5879-qctf6" event={"ID":"8ad9e30b-abf9-45fd-9088-103c94e4ed70","Type":"ContainerStarted","Data":"507b9c74c5025e94796600d70c3425d021076e350f65b3c2da7a6f02528353c9"} Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.800676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65977b5879-qctf6" event={"ID":"8ad9e30b-abf9-45fd-9088-103c94e4ed70","Type":"ContainerStarted","Data":"707af2a63bdd05af37293d4e7a98f5b96492243fd1c69d8b55b8fce62b1f6704"} Jan 30 05:26:47 crc 
kubenswrapper[4841]: I0130 05:26:47.800823 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-65977b5879-qctf6" Jan 30 05:26:47 crc kubenswrapper[4841]: I0130 05:26:47.823121 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-65977b5879-qctf6" podStartSLOduration=1.8231037589999999 podStartE2EDuration="1.823103759s" podCreationTimestamp="2026-01-30 05:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:47.822814121 +0000 UTC m=+1144.816286759" watchObservedRunningTime="2026-01-30 05:26:47.823103759 +0000 UTC m=+1144.816576417" Jan 30 05:26:48 crc kubenswrapper[4841]: I0130 05:26:48.067381 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6bcfb9ffb5-q4t7z" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": dial tcp 10.217.0.154:9696: connect: connection refused" Jan 30 05:26:48 crc kubenswrapper[4841]: I0130 05:26:48.446186 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="212dccf2-8d10-47ad-acf5-3df6a404b15f" path="/var/lib/kubelet/pods/212dccf2-8d10-47ad-acf5-3df6a404b15f/volumes" Jan 30 05:26:48 crc kubenswrapper[4841]: I0130 05:26:48.811053 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a2724da-6b9b-4947-a4e3-894938742304","Type":"ContainerStarted","Data":"111f861be82792295977d9e1509a0d34e506936e498cf079b08adb56c250f14b"} Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.820745 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a2724da-6b9b-4947-a4e3-894938742304","Type":"ContainerStarted","Data":"a480751fcbd360c7947b1114ae086d16d2b9b26b051a46078f3d5cb250e0a982"} Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.821671 
4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.822978 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerID="2b1a0e1c3a5b1bc5a2cea93c3da88301a3042badaf0a1dbc97c75b42d3595c7d" exitCode=0 Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.823022 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bcfb9ffb5-q4t7z" event={"ID":"fc4310f7-3d6f-45dc-a17e-5311da837e81","Type":"ContainerDied","Data":"2b1a0e1c3a5b1bc5a2cea93c3da88301a3042badaf0a1dbc97c75b42d3595c7d"} Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.823065 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bcfb9ffb5-q4t7z" event={"ID":"fc4310f7-3d6f-45dc-a17e-5311da837e81","Type":"ContainerDied","Data":"df0465c42a22f9325cdc36a94eb6914e40060da2ba8c0b4deb2b8e7e9c081ea8"} Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.823081 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0465c42a22f9325cdc36a94eb6914e40060da2ba8c0b4deb2b8e7e9c081ea8" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.843825 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.856385 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.856359607 podStartE2EDuration="3.856359607s" podCreationTimestamp="2026-01-30 05:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:49.851233633 +0000 UTC m=+1146.844706311" watchObservedRunningTime="2026-01-30 05:26:49.856359607 +0000 UTC m=+1146.849832275" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.888795 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-internal-tls-certs\") pod \"fc4310f7-3d6f-45dc-a17e-5311da837e81\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.888878 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-httpd-config\") pod \"fc4310f7-3d6f-45dc-a17e-5311da837e81\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.888927 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-config\") pod \"fc4310f7-3d6f-45dc-a17e-5311da837e81\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.888981 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-ovndb-tls-certs\") pod \"fc4310f7-3d6f-45dc-a17e-5311da837e81\" (UID: 
\"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.889194 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-public-tls-certs\") pod \"fc4310f7-3d6f-45dc-a17e-5311da837e81\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.889288 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-combined-ca-bundle\") pod \"fc4310f7-3d6f-45dc-a17e-5311da837e81\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.889343 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk56k\" (UniqueName: \"kubernetes.io/projected/fc4310f7-3d6f-45dc-a17e-5311da837e81-kube-api-access-xk56k\") pod \"fc4310f7-3d6f-45dc-a17e-5311da837e81\" (UID: \"fc4310f7-3d6f-45dc-a17e-5311da837e81\") " Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.894610 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4310f7-3d6f-45dc-a17e-5311da837e81-kube-api-access-xk56k" (OuterVolumeSpecName: "kube-api-access-xk56k") pod "fc4310f7-3d6f-45dc-a17e-5311da837e81" (UID: "fc4310f7-3d6f-45dc-a17e-5311da837e81"). InnerVolumeSpecName "kube-api-access-xk56k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.900076 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fc4310f7-3d6f-45dc-a17e-5311da837e81" (UID: "fc4310f7-3d6f-45dc-a17e-5311da837e81"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.940838 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-config" (OuterVolumeSpecName: "config") pod "fc4310f7-3d6f-45dc-a17e-5311da837e81" (UID: "fc4310f7-3d6f-45dc-a17e-5311da837e81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.957074 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc4310f7-3d6f-45dc-a17e-5311da837e81" (UID: "fc4310f7-3d6f-45dc-a17e-5311da837e81"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.958310 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc4310f7-3d6f-45dc-a17e-5311da837e81" (UID: "fc4310f7-3d6f-45dc-a17e-5311da837e81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.970185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fc4310f7-3d6f-45dc-a17e-5311da837e81" (UID: "fc4310f7-3d6f-45dc-a17e-5311da837e81"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.970560 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc4310f7-3d6f-45dc-a17e-5311da837e81" (UID: "fc4310f7-3d6f-45dc-a17e-5311da837e81"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.991829 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.991869 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.991881 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.991893 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.991905 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.991917 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk56k\" (UniqueName: 
\"kubernetes.io/projected/fc4310f7-3d6f-45dc-a17e-5311da837e81-kube-api-access-xk56k\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:49 crc kubenswrapper[4841]: I0130 05:26:49.991929 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4310f7-3d6f-45dc-a17e-5311da837e81-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:50 crc kubenswrapper[4841]: I0130 05:26:50.833966 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bcfb9ffb5-q4t7z" Jan 30 05:26:50 crc kubenswrapper[4841]: I0130 05:26:50.868444 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bcfb9ffb5-q4t7z"] Jan 30 05:26:50 crc kubenswrapper[4841]: I0130 05:26:50.887746 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bcfb9ffb5-q4t7z"] Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.312689 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.412315 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"] Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.412544 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" podUID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerName="dnsmasq-dns" containerID="cri-o://aabf880281ea8e9bfd967d4b1672d7e1b1a4b6bda8d727751bb2143a2b045b4e" gracePeriod=10 Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.575496 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.629223 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.849595 4841 
generic.go:334] "Generic (PLEG): container finished" podID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerID="aabf880281ea8e9bfd967d4b1672d7e1b1a4b6bda8d727751bb2143a2b045b4e" exitCode=0 Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.849842 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b806b168-492c-451d-8e35-f550c4f690ed" containerName="cinder-scheduler" containerID="cri-o://6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392" gracePeriod=30 Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.849924 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" event={"ID":"569f263e-9a57-4cff-bb00-da7a3a5923c8","Type":"ContainerDied","Data":"aabf880281ea8e9bfd967d4b1672d7e1b1a4b6bda8d727751bb2143a2b045b4e"} Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.850600 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b806b168-492c-451d-8e35-f550c4f690ed" containerName="probe" containerID="cri-o://31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c" gracePeriod=30 Jan 30 05:26:51 crc kubenswrapper[4841]: I0130 05:26:51.934508 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.029217 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-nb\") pod \"569f263e-9a57-4cff-bb00-da7a3a5923c8\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.029322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf8wt\" (UniqueName: \"kubernetes.io/projected/569f263e-9a57-4cff-bb00-da7a3a5923c8-kube-api-access-zf8wt\") pod \"569f263e-9a57-4cff-bb00-da7a3a5923c8\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.029351 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-config\") pod \"569f263e-9a57-4cff-bb00-da7a3a5923c8\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.029470 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-svc\") pod \"569f263e-9a57-4cff-bb00-da7a3a5923c8\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.029561 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-swift-storage-0\") pod \"569f263e-9a57-4cff-bb00-da7a3a5923c8\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.029621 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-sb\") pod \"569f263e-9a57-4cff-bb00-da7a3a5923c8\" (UID: \"569f263e-9a57-4cff-bb00-da7a3a5923c8\") " Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.043075 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569f263e-9a57-4cff-bb00-da7a3a5923c8-kube-api-access-zf8wt" (OuterVolumeSpecName: "kube-api-access-zf8wt") pod "569f263e-9a57-4cff-bb00-da7a3a5923c8" (UID: "569f263e-9a57-4cff-bb00-da7a3a5923c8"). InnerVolumeSpecName "kube-api-access-zf8wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.101952 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "569f263e-9a57-4cff-bb00-da7a3a5923c8" (UID: "569f263e-9a57-4cff-bb00-da7a3a5923c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.108772 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "569f263e-9a57-4cff-bb00-da7a3a5923c8" (UID: "569f263e-9a57-4cff-bb00-da7a3a5923c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.109221 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "569f263e-9a57-4cff-bb00-da7a3a5923c8" (UID: "569f263e-9a57-4cff-bb00-da7a3a5923c8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.133070 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-config" (OuterVolumeSpecName: "config") pod "569f263e-9a57-4cff-bb00-da7a3a5923c8" (UID: "569f263e-9a57-4cff-bb00-da7a3a5923c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.133591 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.133620 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf8wt\" (UniqueName: \"kubernetes.io/projected/569f263e-9a57-4cff-bb00-da7a3a5923c8-kube-api-access-zf8wt\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.133631 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.133640 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.133648 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.145876 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "569f263e-9a57-4cff-bb00-da7a3a5923c8" (UID: "569f263e-9a57-4cff-bb00-da7a3a5923c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.235859 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/569f263e-9a57-4cff-bb00-da7a3a5923c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.441985 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" path="/var/lib/kubelet/pods/fc4310f7-3d6f-45dc-a17e-5311da837e81/volumes" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.860750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" event={"ID":"569f263e-9a57-4cff-bb00-da7a3a5923c8","Type":"ContainerDied","Data":"6e023825af3d1da526affd9f95a0db4a2494646b21789b16565cd36340b200d6"} Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.860763 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-n9dmx" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.860805 4841 scope.go:117] "RemoveContainer" containerID="aabf880281ea8e9bfd967d4b1672d7e1b1a4b6bda8d727751bb2143a2b045b4e" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.863781 4841 generic.go:334] "Generic (PLEG): container finished" podID="b806b168-492c-451d-8e35-f550c4f690ed" containerID="31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c" exitCode=0 Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.863822 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b806b168-492c-451d-8e35-f550c4f690ed","Type":"ContainerDied","Data":"31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c"} Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.889895 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"] Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.890201 4841 scope.go:117] "RemoveContainer" containerID="874020d90aa08682927abe4c4d2d7fd0c134e81df493ef2d1dd120899bbaf959" Jan 30 05:26:52 crc kubenswrapper[4841]: I0130 05:26:52.896987 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-n9dmx"] Jan 30 05:26:53 crc kubenswrapper[4841]: I0130 05:26:53.472941 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:53 crc kubenswrapper[4841]: I0130 05:26:53.503601 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:53 crc kubenswrapper[4841]: I0130 05:26:53.909841 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5445c58497-n245m" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.353112 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.367123 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.388373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-combined-ca-bundle\") pod \"b806b168-492c-451d-8e35-f550c4f690ed\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.388502 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kdgm\" (UniqueName: \"kubernetes.io/projected/b806b168-492c-451d-8e35-f550c4f690ed-kube-api-access-7kdgm\") pod \"b806b168-492c-451d-8e35-f550c4f690ed\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.388525 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data-custom\") pod \"b806b168-492c-451d-8e35-f550c4f690ed\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.388557 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b806b168-492c-451d-8e35-f550c4f690ed-etc-machine-id\") pod \"b806b168-492c-451d-8e35-f550c4f690ed\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.388585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data\") pod \"b806b168-492c-451d-8e35-f550c4f690ed\" (UID: 
\"b806b168-492c-451d-8e35-f550c4f690ed\") " Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.388668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-scripts\") pod \"b806b168-492c-451d-8e35-f550c4f690ed\" (UID: \"b806b168-492c-451d-8e35-f550c4f690ed\") " Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.388864 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b806b168-492c-451d-8e35-f550c4f690ed-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b806b168-492c-451d-8e35-f550c4f690ed" (UID: "b806b168-492c-451d-8e35-f550c4f690ed"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.389261 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b806b168-492c-451d-8e35-f550c4f690ed-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.393792 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-scripts" (OuterVolumeSpecName: "scripts") pod "b806b168-492c-451d-8e35-f550c4f690ed" (UID: "b806b168-492c-451d-8e35-f550c4f690ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.393792 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b806b168-492c-451d-8e35-f550c4f690ed-kube-api-access-7kdgm" (OuterVolumeSpecName: "kube-api-access-7kdgm") pod "b806b168-492c-451d-8e35-f550c4f690ed" (UID: "b806b168-492c-451d-8e35-f550c4f690ed"). InnerVolumeSpecName "kube-api-access-7kdgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.405643 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.417179 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b806b168-492c-451d-8e35-f550c4f690ed" (UID: "b806b168-492c-451d-8e35-f550c4f690ed"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.457273 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="569f263e-9a57-4cff-bb00-da7a3a5923c8" path="/var/lib/kubelet/pods/569f263e-9a57-4cff-bb00-da7a3a5923c8/volumes" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.460197 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b806b168-492c-451d-8e35-f550c4f690ed" (UID: "b806b168-492c-451d-8e35-f550c4f690ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.490900 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7cb787b658-zbrbz"] Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.491390 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.491435 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.491445 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kdgm\" (UniqueName: \"kubernetes.io/projected/b806b168-492c-451d-8e35-f550c4f690ed-kube-api-access-7kdgm\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.491453 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.523844 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data" (OuterVolumeSpecName: "config-data") pod "b806b168-492c-451d-8e35-f550c4f690ed" (UID: "b806b168-492c-451d-8e35-f550c4f690ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.594262 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b806b168-492c-451d-8e35-f550c4f690ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.884571 4841 generic.go:334] "Generic (PLEG): container finished" podID="b806b168-492c-451d-8e35-f550c4f690ed" containerID="6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392" exitCode=0 Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.884631 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.884648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b806b168-492c-451d-8e35-f550c4f690ed","Type":"ContainerDied","Data":"6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392"} Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.884952 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b806b168-492c-451d-8e35-f550c4f690ed","Type":"ContainerDied","Data":"1fb8c240d9f8fcd64531009da82dfee80abade1be7e540a31a4dd029397b2618"} Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.884970 4841 scope.go:117] "RemoveContainer" containerID="31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.885521 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7cb787b658-zbrbz" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-log" containerID="cri-o://71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0" gracePeriod=30 Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.885559 4841 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/placement-7cb787b658-zbrbz" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-api" containerID="cri-o://e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6" gracePeriod=30 Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.917219 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.917360 4841 scope.go:117] "RemoveContainer" containerID="6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.930585 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.942414 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.944353 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-api" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944446 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-api" Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.944486 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerName="dnsmasq-dns" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944494 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerName="dnsmasq-dns" Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.944501 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b806b168-492c-451d-8e35-f550c4f690ed" containerName="cinder-scheduler" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944507 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b806b168-492c-451d-8e35-f550c4f690ed" containerName="cinder-scheduler" Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.944526 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerName="init" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944533 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerName="init" Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.944544 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b806b168-492c-451d-8e35-f550c4f690ed" containerName="probe" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944549 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b806b168-492c-451d-8e35-f550c4f690ed" containerName="probe" Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.944564 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-httpd" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944569 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-httpd" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944739 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-api" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944757 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b806b168-492c-451d-8e35-f550c4f690ed" containerName="probe" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944768 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="569f263e-9a57-4cff-bb00-da7a3a5923c8" containerName="dnsmasq-dns" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944777 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b806b168-492c-451d-8e35-f550c4f690ed" 
containerName="cinder-scheduler" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.944788 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4310f7-3d6f-45dc-a17e-5311da837e81" containerName="neutron-httpd" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.945783 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.950677 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.952604 4841 scope.go:117] "RemoveContainer" containerID="31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c" Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.953376 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c\": container with ID starting with 31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c not found: ID does not exist" containerID="31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.953565 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c"} err="failed to get container status \"31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c\": rpc error: code = NotFound desc = could not find container \"31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c\": container with ID starting with 31b8194ed81b283fe2ee072b40c4e54ec4f869d353ee01ac271ca3494ef2e72c not found: ID does not exist" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.953596 4841 scope.go:117] "RemoveContainer" containerID="6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392" 
Jan 30 05:26:54 crc kubenswrapper[4841]: E0130 05:26:54.953940 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392\": container with ID starting with 6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392 not found: ID does not exist" containerID="6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.953977 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392"} err="failed to get container status \"6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392\": rpc error: code = NotFound desc = could not find container \"6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392\": container with ID starting with 6286f2eeae4d2bc22daaa97d86103a67b48d07b416198492983fac47b153c392 not found: ID does not exist" Jan 30 05:26:54 crc kubenswrapper[4841]: I0130 05:26:54.957344 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.001907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-scripts\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.002012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwkbg\" (UniqueName: \"kubernetes.io/projected/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-kube-api-access-nwkbg\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 
05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.002068 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.002166 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.002232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.002280 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.103384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.103852 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.104002 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.104186 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.104276 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-scripts\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.104370 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwkbg\" (UniqueName: \"kubernetes.io/projected/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-kube-api-access-nwkbg\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.104481 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.108673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.109308 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.109761 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.117964 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-scripts\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.123534 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwkbg\" (UniqueName: \"kubernetes.io/projected/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-kube-api-access-nwkbg\") pod \"cinder-scheduler-0\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " pod="openstack/cinder-scheduler-0" 
Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.170896 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.185016 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.188053 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.188349 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8tmfn" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.188988 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.190998 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.306517 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config-secret\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.306573 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhlr\" (UniqueName: \"kubernetes.io/projected/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-kube-api-access-lzhlr\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.306603 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.306817 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.338899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.409808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config-secret\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.410249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhlr\" (UniqueName: \"kubernetes.io/projected/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-kube-api-access-lzhlr\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.410304 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.410352 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.413199 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.414305 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config-secret\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.427982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.429801 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzhlr\" (UniqueName: \"kubernetes.io/projected/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-kube-api-access-lzhlr\") pod \"openstackclient\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.519686 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.769595 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 05:26:55 crc kubenswrapper[4841]: W0130 05:26:55.774016 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf39f9d1b_33bb_4c4f_b168_3e7b2cdcf7d5.slice/crio-b79f236a321bfa0df59d90243aa34bff32bc348982127bc773bfc65d7b9e059f WatchSource:0}: Error finding container b79f236a321bfa0df59d90243aa34bff32bc348982127bc773bfc65d7b9e059f: Status 404 returned error can't find the container with id b79f236a321bfa0df59d90243aa34bff32bc348982127bc773bfc65d7b9e059f Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.817907 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.894699 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5","Type":"ContainerStarted","Data":"b79f236a321bfa0df59d90243aa34bff32bc348982127bc773bfc65d7b9e059f"} Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.895623 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc","Type":"ContainerStarted","Data":"3ef8c0e368d68aa24591968b393ecd2a6f1a6f912debf6a12b2028a83d4c98e8"} Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.904373 4841 generic.go:334] "Generic (PLEG): container finished" podID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerID="71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0" exitCode=143 Jan 30 05:26:55 crc kubenswrapper[4841]: I0130 05:26:55.904432 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb787b658-zbrbz" 
event={"ID":"8b660389-643d-46fa-aa90-747dc6dbe5f9","Type":"ContainerDied","Data":"71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0"} Jan 30 05:26:56 crc kubenswrapper[4841]: I0130 05:26:56.442066 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b806b168-492c-451d-8e35-f550c4f690ed" path="/var/lib/kubelet/pods/b806b168-492c-451d-8e35-f550c4f690ed/volumes" Jan 30 05:26:56 crc kubenswrapper[4841]: I0130 05:26:56.921420 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc","Type":"ContainerStarted","Data":"008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25"} Jan 30 05:26:57 crc kubenswrapper[4841]: I0130 05:26:57.954625 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc","Type":"ContainerStarted","Data":"0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87"} Jan 30 05:26:57 crc kubenswrapper[4841]: I0130 05:26:57.986775 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.986753873 podStartE2EDuration="3.986753873s" podCreationTimestamp="2026-01-30 05:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:26:57.978556568 +0000 UTC m=+1154.972029206" watchObservedRunningTime="2026-01-30 05:26:57.986753873 +0000 UTC m=+1154.980226511" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.590519 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.768559 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-public-tls-certs\") pod \"8b660389-643d-46fa-aa90-747dc6dbe5f9\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.768628 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-config-data\") pod \"8b660389-643d-46fa-aa90-747dc6dbe5f9\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.768668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-combined-ca-bundle\") pod \"8b660389-643d-46fa-aa90-747dc6dbe5f9\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.768724 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c4xt\" (UniqueName: \"kubernetes.io/projected/8b660389-643d-46fa-aa90-747dc6dbe5f9-kube-api-access-9c4xt\") pod \"8b660389-643d-46fa-aa90-747dc6dbe5f9\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.768779 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b660389-643d-46fa-aa90-747dc6dbe5f9-logs\") pod \"8b660389-643d-46fa-aa90-747dc6dbe5f9\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.768802 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-internal-tls-certs\") pod \"8b660389-643d-46fa-aa90-747dc6dbe5f9\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.768885 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-scripts\") pod \"8b660389-643d-46fa-aa90-747dc6dbe5f9\" (UID: \"8b660389-643d-46fa-aa90-747dc6dbe5f9\") " Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.769543 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b660389-643d-46fa-aa90-747dc6dbe5f9-logs" (OuterVolumeSpecName: "logs") pod "8b660389-643d-46fa-aa90-747dc6dbe5f9" (UID: "8b660389-643d-46fa-aa90-747dc6dbe5f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.776501 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-scripts" (OuterVolumeSpecName: "scripts") pod "8b660389-643d-46fa-aa90-747dc6dbe5f9" (UID: "8b660389-643d-46fa-aa90-747dc6dbe5f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.776540 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b660389-643d-46fa-aa90-747dc6dbe5f9-kube-api-access-9c4xt" (OuterVolumeSpecName: "kube-api-access-9c4xt") pod "8b660389-643d-46fa-aa90-747dc6dbe5f9" (UID: "8b660389-643d-46fa-aa90-747dc6dbe5f9"). InnerVolumeSpecName "kube-api-access-9c4xt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.834559 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b660389-643d-46fa-aa90-747dc6dbe5f9" (UID: "8b660389-643d-46fa-aa90-747dc6dbe5f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.871547 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.871769 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c4xt\" (UniqueName: \"kubernetes.io/projected/8b660389-643d-46fa-aa90-747dc6dbe5f9-kube-api-access-9c4xt\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.871938 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b660389-643d-46fa-aa90-747dc6dbe5f9-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.872095 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.876519 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8b660389-643d-46fa-aa90-747dc6dbe5f9" (UID: "8b660389-643d-46fa-aa90-747dc6dbe5f9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.885461 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-config-data" (OuterVolumeSpecName: "config-data") pod "8b660389-643d-46fa-aa90-747dc6dbe5f9" (UID: "8b660389-643d-46fa-aa90-747dc6dbe5f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.892520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8b660389-643d-46fa-aa90-747dc6dbe5f9" (UID: "8b660389-643d-46fa-aa90-747dc6dbe5f9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.964915 4841 generic.go:334] "Generic (PLEG): container finished" podID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerID="e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6" exitCode=0 Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.964959 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb787b658-zbrbz" event={"ID":"8b660389-643d-46fa-aa90-747dc6dbe5f9","Type":"ContainerDied","Data":"e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6"} Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.964999 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cb787b658-zbrbz" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.965020 4841 scope.go:117] "RemoveContainer" containerID="e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6" Jan 30 05:26:58 crc kubenswrapper[4841]: I0130 05:26:58.965007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb787b658-zbrbz" event={"ID":"8b660389-643d-46fa-aa90-747dc6dbe5f9","Type":"ContainerDied","Data":"6e921f6df5c6d836874ba876b9fd7e3235b60728d4235eb932dee3d6d7b157f5"} Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.001093 4841 scope.go:117] "RemoveContainer" containerID="71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.010188 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.010213 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.010239 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b660389-643d-46fa-aa90-747dc6dbe5f9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.031767 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7cb787b658-zbrbz"] Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.033798 4841 scope.go:117] "RemoveContainer" containerID="e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6" Jan 30 05:26:59 crc kubenswrapper[4841]: E0130 05:26:59.034275 4841 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6\": container with ID starting with e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6 not found: ID does not exist" containerID="e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.034345 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6"} err="failed to get container status \"e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6\": rpc error: code = NotFound desc = could not find container \"e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6\": container with ID starting with e986251cf81ed5c82eff960464cef0566e0bb17b00de1d192a643b64efd746c6 not found: ID does not exist" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.034372 4841 scope.go:117] "RemoveContainer" containerID="71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0" Jan 30 05:26:59 crc kubenswrapper[4841]: E0130 05:26:59.035896 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0\": container with ID starting with 71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0 not found: ID does not exist" containerID="71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.035953 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0"} err="failed to get container status \"71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0\": rpc error: code = NotFound desc = could not find container 
\"71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0\": container with ID starting with 71e1ae2f136ea6ea98636350a268c9409f4fcbf1273be71a77fa708405a43dd0 not found: ID does not exist" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.043162 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7cb787b658-zbrbz"] Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.138821 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5fb964589-phnmn"] Jan 30 05:26:59 crc kubenswrapper[4841]: E0130 05:26:59.139197 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-log" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.139214 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-log" Jan 30 05:26:59 crc kubenswrapper[4841]: E0130 05:26:59.139227 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-api" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.139233 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-api" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.139424 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-log" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.139447 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" containerName="placement-api" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.140341 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.143415 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.143872 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.144065 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.149014 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5fb964589-phnmn"] Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.214703 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-combined-ca-bundle\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.214785 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-public-tls-certs\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.214837 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-internal-tls-certs\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc 
kubenswrapper[4841]: I0130 05:26:59.214883 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cglr\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-kube-api-access-8cglr\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.214910 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-etc-swift\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.215020 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-config-data\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.215055 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-log-httpd\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.215096 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-run-httpd\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 
05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.310885 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.316720 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-config-data\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.316769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-log-httpd\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.316812 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-run-httpd\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.316846 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-combined-ca-bundle\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.316884 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-public-tls-certs\") pod \"swift-proxy-5fb964589-phnmn\" (UID: 
\"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.316930 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-internal-tls-certs\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.316972 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cglr\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-kube-api-access-8cglr\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.317001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-etc-swift\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.317372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-log-httpd\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.317463 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-run-httpd\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 
30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.320785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-internal-tls-certs\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.320898 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-public-tls-certs\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.321503 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-config-data\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.327504 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-etc-swift\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.327865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-combined-ca-bundle\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.345959 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8cglr\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-kube-api-access-8cglr\") pod \"swift-proxy-5fb964589-phnmn\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:26:59 crc kubenswrapper[4841]: I0130 05:26:59.471696 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:27:00 crc kubenswrapper[4841]: I0130 05:27:00.049927 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5fb964589-phnmn"] Jan 30 05:27:00 crc kubenswrapper[4841]: W0130 05:27:00.056051 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf198eff9_f493_43d9_9b64_06196b205142.slice/crio-fe33d56b31bdb343fca3d156e40bf9674067615261c3d2c6b427d4633546e434 WatchSource:0}: Error finding container fe33d56b31bdb343fca3d156e40bf9674067615261c3d2c6b427d4633546e434: Status 404 returned error can't find the container with id fe33d56b31bdb343fca3d156e40bf9674067615261c3d2c6b427d4633546e434 Jan 30 05:27:00 crc kubenswrapper[4841]: I0130 05:27:00.340455 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 05:27:00 crc kubenswrapper[4841]: I0130 05:27:00.443558 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b660389-643d-46fa-aa90-747dc6dbe5f9" path="/var/lib/kubelet/pods/8b660389-643d-46fa-aa90-747dc6dbe5f9/volumes" Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.003040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb964589-phnmn" event={"ID":"f198eff9-f493-43d9-9b64-06196b205142","Type":"ContainerStarted","Data":"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d"} Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.003081 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-proxy-5fb964589-phnmn" event={"ID":"f198eff9-f493-43d9-9b64-06196b205142","Type":"ContainerStarted","Data":"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3"} Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.003092 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb964589-phnmn" event={"ID":"f198eff9-f493-43d9-9b64-06196b205142","Type":"ContainerStarted","Data":"fe33d56b31bdb343fca3d156e40bf9674067615261c3d2c6b427d4633546e434"} Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.004104 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.004127 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.037344 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5fb964589-phnmn" podStartSLOduration=2.037322959 podStartE2EDuration="2.037322959s" podCreationTimestamp="2026-01-30 05:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:01.034235278 +0000 UTC m=+1158.027707916" watchObservedRunningTime="2026-01-30 05:27:01.037322959 +0000 UTC m=+1158.030795597" Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.950942 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.951480 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="proxy-httpd" containerID="cri-o://e586caec9c795381265e025d82640e7144ebc4152d95c80e46a5de9fea6a4eb1" gracePeriod=30 Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 
05:27:01.951505 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="sg-core" containerID="cri-o://6991ac0d0f5787ebee296eaa5f3ad0cd646e494f4dcbc1bfe860414036eea26a" gracePeriod=30 Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.951544 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-notification-agent" containerID="cri-o://6684dd85866b7d240e58a458411946df2ce04bbc98a59876a1263d23cc34dd60" gracePeriod=30 Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.952916 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-central-agent" containerID="cri-o://8ff43360aaa71073b396349def6db95a5090b4128841c76b4eb8f7a4aa9fc054" gracePeriod=30 Jan 30 05:27:01 crc kubenswrapper[4841]: I0130 05:27:01.960319 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": EOF" Jan 30 05:27:03 crc kubenswrapper[4841]: I0130 05:27:03.023343 4841 generic.go:334] "Generic (PLEG): container finished" podID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerID="e586caec9c795381265e025d82640e7144ebc4152d95c80e46a5de9fea6a4eb1" exitCode=0 Jan 30 05:27:03 crc kubenswrapper[4841]: I0130 05:27:03.023430 4841 generic.go:334] "Generic (PLEG): container finished" podID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerID="6991ac0d0f5787ebee296eaa5f3ad0cd646e494f4dcbc1bfe860414036eea26a" exitCode=2 Jan 30 05:27:03 crc kubenswrapper[4841]: I0130 05:27:03.023443 4841 generic.go:334] "Generic (PLEG): container finished" podID="29253fdb-c316-4287-b2e4-0e5d129bfed5" 
containerID="8ff43360aaa71073b396349def6db95a5090b4128841c76b4eb8f7a4aa9fc054" exitCode=0 Jan 30 05:27:03 crc kubenswrapper[4841]: I0130 05:27:03.023432 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerDied","Data":"e586caec9c795381265e025d82640e7144ebc4152d95c80e46a5de9fea6a4eb1"} Jan 30 05:27:03 crc kubenswrapper[4841]: I0130 05:27:03.023562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerDied","Data":"6991ac0d0f5787ebee296eaa5f3ad0cd646e494f4dcbc1bfe860414036eea26a"} Jan 30 05:27:03 crc kubenswrapper[4841]: I0130 05:27:03.023589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerDied","Data":"8ff43360aaa71073b396349def6db95a5090b4128841c76b4eb8f7a4aa9fc054"} Jan 30 05:27:05 crc kubenswrapper[4841]: I0130 05:27:05.549090 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.060151 4841 generic.go:334] "Generic (PLEG): container finished" podID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerID="6684dd85866b7d240e58a458411946df2ce04bbc98a59876a1263d23cc34dd60" exitCode=0 Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.060621 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerDied","Data":"6684dd85866b7d240e58a458411946df2ce04bbc98a59876a1263d23cc34dd60"} Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.183552 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.324741 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-combined-ca-bundle\") pod \"29253fdb-c316-4287-b2e4-0e5d129bfed5\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.324812 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-log-httpd\") pod \"29253fdb-c316-4287-b2e4-0e5d129bfed5\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.324857 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-scripts\") pod \"29253fdb-c316-4287-b2e4-0e5d129bfed5\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.324879 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99lbh\" (UniqueName: \"kubernetes.io/projected/29253fdb-c316-4287-b2e4-0e5d129bfed5-kube-api-access-99lbh\") pod \"29253fdb-c316-4287-b2e4-0e5d129bfed5\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.324909 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-sg-core-conf-yaml\") pod \"29253fdb-c316-4287-b2e4-0e5d129bfed5\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.324928 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-run-httpd\") pod \"29253fdb-c316-4287-b2e4-0e5d129bfed5\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.324967 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-config-data\") pod \"29253fdb-c316-4287-b2e4-0e5d129bfed5\" (UID: \"29253fdb-c316-4287-b2e4-0e5d129bfed5\") " Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.326073 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29253fdb-c316-4287-b2e4-0e5d129bfed5" (UID: "29253fdb-c316-4287-b2e4-0e5d129bfed5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.326157 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29253fdb-c316-4287-b2e4-0e5d129bfed5" (UID: "29253fdb-c316-4287-b2e4-0e5d129bfed5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.340011 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-scripts" (OuterVolumeSpecName: "scripts") pod "29253fdb-c316-4287-b2e4-0e5d129bfed5" (UID: "29253fdb-c316-4287-b2e4-0e5d129bfed5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.346564 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29253fdb-c316-4287-b2e4-0e5d129bfed5-kube-api-access-99lbh" (OuterVolumeSpecName: "kube-api-access-99lbh") pod "29253fdb-c316-4287-b2e4-0e5d129bfed5" (UID: "29253fdb-c316-4287-b2e4-0e5d129bfed5"). InnerVolumeSpecName "kube-api-access-99lbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.375514 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29253fdb-c316-4287-b2e4-0e5d129bfed5" (UID: "29253fdb-c316-4287-b2e4-0e5d129bfed5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.426416 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.426448 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.426456 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99lbh\" (UniqueName: \"kubernetes.io/projected/29253fdb-c316-4287-b2e4-0e5d129bfed5-kube-api-access-99lbh\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.426465 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.426473 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29253fdb-c316-4287-b2e4-0e5d129bfed5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.437875 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29253fdb-c316-4287-b2e4-0e5d129bfed5" (UID: "29253fdb-c316-4287-b2e4-0e5d129bfed5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.440673 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-config-data" (OuterVolumeSpecName: "config-data") pod "29253fdb-c316-4287-b2e4-0e5d129bfed5" (UID: "29253fdb-c316-4287-b2e4-0e5d129bfed5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.528278 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:07 crc kubenswrapper[4841]: I0130 05:27:07.528305 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29253fdb-c316-4287-b2e4-0e5d129bfed5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.080652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29253fdb-c316-4287-b2e4-0e5d129bfed5","Type":"ContainerDied","Data":"3f8ad2455011d649cfcdcdbfa46334bebb54f7372acdeec67f025608cc7f3c11"} Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.080728 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.080987 4841 scope.go:117] "RemoveContainer" containerID="e586caec9c795381265e025d82640e7144ebc4152d95c80e46a5de9fea6a4eb1" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.081857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5","Type":"ContainerStarted","Data":"3c7852e63582709a9e74b827146a470d704988cb06dbe6df77ac6ac4fc666c94"} Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.098879 4841 scope.go:117] "RemoveContainer" containerID="6991ac0d0f5787ebee296eaa5f3ad0cd646e494f4dcbc1bfe860414036eea26a" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.110069 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.036222583 podStartE2EDuration="13.11005173s" podCreationTimestamp="2026-01-30 05:26:55 +0000 
UTC" firstStartedPulling="2026-01-30 05:26:55.777941689 +0000 UTC m=+1152.771414337" lastFinishedPulling="2026-01-30 05:27:06.851770846 +0000 UTC m=+1163.845243484" observedRunningTime="2026-01-30 05:27:08.105759318 +0000 UTC m=+1165.099231956" watchObservedRunningTime="2026-01-30 05:27:08.11005173 +0000 UTC m=+1165.103524368" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.125190 4841 scope.go:117] "RemoveContainer" containerID="6684dd85866b7d240e58a458411946df2ce04bbc98a59876a1263d23cc34dd60" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.132982 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.141817 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.157386 4841 scope.go:117] "RemoveContainer" containerID="8ff43360aaa71073b396349def6db95a5090b4128841c76b4eb8f7a4aa9fc054" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.166530 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:08 crc kubenswrapper[4841]: E0130 05:27:08.166880 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-notification-agent" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.166897 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-notification-agent" Jan 30 05:27:08 crc kubenswrapper[4841]: E0130 05:27:08.166913 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="sg-core" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.166919 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="sg-core" Jan 30 05:27:08 crc kubenswrapper[4841]: E0130 05:27:08.166929 
4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-central-agent" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.166935 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-central-agent" Jan 30 05:27:08 crc kubenswrapper[4841]: E0130 05:27:08.166948 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="proxy-httpd" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.166953 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="proxy-httpd" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.167110 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="proxy-httpd" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.167123 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="sg-core" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.167134 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-central-agent" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.167146 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" containerName="ceilometer-notification-agent" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.168600 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.171304 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.171690 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.182630 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.344653 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-log-httpd\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.344694 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.344721 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-config-data\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.344741 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " 
pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.344780 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-scripts\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.344816 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-run-httpd\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.344853 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5m9p\" (UniqueName: \"kubernetes.io/projected/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-kube-api-access-z5m9p\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.445897 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29253fdb-c316-4287-b2e4-0e5d129bfed5" path="/var/lib/kubelet/pods/29253fdb-c316-4287-b2e4-0e5d129bfed5/volumes" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.446901 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-log-httpd\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.446944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.446968 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-config-data\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.446986 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.447037 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-scripts\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.447090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-run-httpd\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.447131 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5m9p\" (UniqueName: \"kubernetes.io/projected/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-kube-api-access-z5m9p\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 
05:27:08.448607 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-run-httpd\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.448625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-log-httpd\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.452350 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-scripts\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.453508 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.453705 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.454634 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-config-data\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " 
pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.463973 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5m9p\" (UniqueName: \"kubernetes.io/projected/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-kube-api-access-z5m9p\") pod \"ceilometer-0\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.519527 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:08 crc kubenswrapper[4841]: I0130 05:27:08.970380 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:08 crc kubenswrapper[4841]: W0130 05:27:08.976230 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode91f8b2f_dd5c_435f_b0bb_3ee3b8437454.slice/crio-06b535c52dbbbc78e205de2f1dcc6406f61c1d417aab7b252d06a50b4ea00012 WatchSource:0}: Error finding container 06b535c52dbbbc78e205de2f1dcc6406f61c1d417aab7b252d06a50b4ea00012: Status 404 returned error can't find the container with id 06b535c52dbbbc78e205de2f1dcc6406f61c1d417aab7b252d06a50b4ea00012 Jan 30 05:27:09 crc kubenswrapper[4841]: I0130 05:27:09.093713 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerStarted","Data":"06b535c52dbbbc78e205de2f1dcc6406f61c1d417aab7b252d06a50b4ea00012"} Jan 30 05:27:09 crc kubenswrapper[4841]: I0130 05:27:09.477265 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:27:09 crc kubenswrapper[4841]: I0130 05:27:09.489914 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:27:10 crc kubenswrapper[4841]: I0130 05:27:10.110661 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerStarted","Data":"0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62"} Jan 30 05:27:10 crc kubenswrapper[4841]: I0130 05:27:10.617787 4841 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod502d6fe3-4215-4b32-8546-a55e5a4afc91"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod502d6fe3-4215-4b32-8546-a55e5a4afc91] : Timed out while waiting for systemd to remove kubepods-besteffort-pod502d6fe3_4215_4b32_8546_a55e5a4afc91.slice" Jan 30 05:27:11 crc kubenswrapper[4841]: I0130 05:27:11.118348 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerStarted","Data":"586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a"} Jan 30 05:27:11 crc kubenswrapper[4841]: I0130 05:27:11.641004 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:12 crc kubenswrapper[4841]: I0130 05:27:12.131394 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerStarted","Data":"1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576"} Jan 30 05:27:13 crc kubenswrapper[4841]: I0130 05:27:13.473316 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:13 crc kubenswrapper[4841]: I0130 05:27:13.474045 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-log" containerID="cri-o://01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3" gracePeriod=30 Jan 30 05:27:13 crc kubenswrapper[4841]: I0130 05:27:13.474238 4841 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-httpd" containerID="cri-o://78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7" gracePeriod=30 Jan 30 05:27:14 crc kubenswrapper[4841]: I0130 05:27:14.162942 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerID="01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3" exitCode=143 Jan 30 05:27:14 crc kubenswrapper[4841]: I0130 05:27:14.163046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556","Type":"ContainerDied","Data":"01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3"} Jan 30 05:27:14 crc kubenswrapper[4841]: I0130 05:27:14.501386 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:14 crc kubenswrapper[4841]: I0130 05:27:14.501934 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-log" containerID="cri-o://e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52" gracePeriod=30 Jan 30 05:27:14 crc kubenswrapper[4841]: I0130 05:27:14.501981 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-httpd" containerID="cri-o://c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723" gracePeriod=30 Jan 30 05:27:14 crc kubenswrapper[4841]: E0130 05:27:14.680699 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d68f5be_dc23_4bfd_9dc3_4d934aff76d0.slice/crio-conmon-e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d68f5be_dc23_4bfd_9dc3_4d934aff76d0.slice/crio-e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.174348 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerStarted","Data":"5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094"} Jan 30 05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.174733 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.174731 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-central-agent" containerID="cri-o://0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.174885 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="proxy-httpd" containerID="cri-o://5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.174949 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="sg-core" containerID="cri-o://1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576" gracePeriod=30 Jan 30 
05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.175008 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-notification-agent" containerID="cri-o://586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a" gracePeriod=30 Jan 30 05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.177691 4841 generic.go:334] "Generic (PLEG): container finished" podID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerID="e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52" exitCode=143 Jan 30 05:27:15 crc kubenswrapper[4841]: I0130 05:27:15.177737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0","Type":"ContainerDied","Data":"e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52"} Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.189022 4841 generic.go:334] "Generic (PLEG): container finished" podID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerID="5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094" exitCode=0 Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.189068 4841 generic.go:334] "Generic (PLEG): container finished" podID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerID="1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576" exitCode=2 Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.189082 4841 generic.go:334] "Generic (PLEG): container finished" podID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerID="586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a" exitCode=0 Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.189108 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerDied","Data":"5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094"} Jan 30 
05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.189143 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerDied","Data":"1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576"} Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.189162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerDied","Data":"586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a"} Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.376686 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-65977b5879-qctf6" Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.406382 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.330103603 podStartE2EDuration="8.406349488s" podCreationTimestamp="2026-01-30 05:27:08 +0000 UTC" firstStartedPulling="2026-01-30 05:27:08.979115628 +0000 UTC m=+1165.972588266" lastFinishedPulling="2026-01-30 05:27:14.055361513 +0000 UTC m=+1171.048834151" observedRunningTime="2026-01-30 05:27:15.195248539 +0000 UTC m=+1172.188721177" watchObservedRunningTime="2026-01-30 05:27:16.406349488 +0000 UTC m=+1173.399822166" Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.457817 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f5986d768-c66l5"] Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.458037 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f5986d768-c66l5" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-api" containerID="cri-o://667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345" gracePeriod=30 Jan 30 05:27:16 crc kubenswrapper[4841]: I0130 05:27:16.458407 4841 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/neutron-f5986d768-c66l5" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-httpd" containerID="cri-o://db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f" gracePeriod=30 Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.116323 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.201532 4841 generic.go:334] "Generic (PLEG): container finished" podID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerID="db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f" exitCode=0 Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.201610 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5986d768-c66l5" event={"ID":"3afd6c02-28f4-4dae-92ea-b3d04085383b","Type":"ContainerDied","Data":"db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f"} Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.204687 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerID="78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7" exitCode=0 Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.204810 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.204836 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556","Type":"ContainerDied","Data":"78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7"} Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.205160 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556","Type":"ContainerDied","Data":"f9520a966ad3395753a91a91f2d76e8ba7e77f0407a1d27fa908e2ab168dfbd2"} Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.205185 4841 scope.go:117] "RemoveContainer" containerID="78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.238029 4841 scope.go:117] "RemoveContainer" containerID="01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243064 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-scripts\") pod \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243117 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-combined-ca-bundle\") pod \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243160 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243214 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfqzd\" (UniqueName: \"kubernetes.io/projected/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-kube-api-access-zfqzd\") pod \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243240 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-httpd-run\") pod \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-config-data\") pod \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243417 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-logs\") pod \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243458 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-public-tls-certs\") pod \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\" (UID: \"fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556\") " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.243998 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-logs" (OuterVolumeSpecName: "logs") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.244221 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.248503 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.248887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-kube-api-access-zfqzd" (OuterVolumeSpecName: "kube-api-access-zfqzd") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "kube-api-access-zfqzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.254734 4841 scope.go:117] "RemoveContainer" containerID="78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7" Jan 30 05:27:17 crc kubenswrapper[4841]: E0130 05:27:17.255193 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7\": container with ID starting with 78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7 not found: ID does not exist" containerID="78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.255290 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7"} err="failed to get container status \"78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7\": rpc error: code = NotFound desc = could not find container \"78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7\": container with ID starting with 78dc4743670a9c1435d9e30b0d851c56bcc707c53c9295d963d7568553c6bac7 not found: ID does not exist" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.255367 4841 scope.go:117] "RemoveContainer" containerID="01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3" Jan 30 05:27:17 crc kubenswrapper[4841]: E0130 05:27:17.255719 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3\": container with ID starting with 01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3 not found: ID does not exist" containerID="01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.255818 
4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3"} err="failed to get container status \"01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3\": rpc error: code = NotFound desc = could not find container \"01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3\": container with ID starting with 01ea67d3e037bd8ff1c7a1b3da1f887d9bb259c367426b8122a6f452f62883c3 not found: ID does not exist" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.266601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-scripts" (OuterVolumeSpecName: "scripts") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.271742 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.300705 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.307214 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-config-data" (OuterVolumeSpecName: "config-data") pod "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" (UID: "fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345280 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345319 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345332 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345343 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345382 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345412 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfqzd\" (UniqueName: 
\"kubernetes.io/projected/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-kube-api-access-zfqzd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345427 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.345439 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.369507 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.446524 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.538184 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.548486 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.564366 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:17 crc kubenswrapper[4841]: E0130 05:27:17.565993 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-log" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.566011 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-log" Jan 30 
05:27:17 crc kubenswrapper[4841]: E0130 05:27:17.566033 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-httpd" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.566039 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-httpd" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.566207 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-httpd" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.566222 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" containerName="glance-log" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.568780 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.575744 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.576122 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.588975 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752510 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-logs\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752782 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752812 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752833 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752896 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pwb\" (UniqueName: \"kubernetes.io/projected/a9e49c58-8075-46a1-8bfd-44412a673589-kube-api-access-m5pwb\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752924 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752944 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.752972 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.854990 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5pwb\" (UniqueName: \"kubernetes.io/projected/a9e49c58-8075-46a1-8bfd-44412a673589-kube-api-access-m5pwb\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855045 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855072 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855102 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-logs\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855169 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855198 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855218 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855521 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855670 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-logs\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.855735 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.860638 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.862373 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.868846 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.877532 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.878044 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pwb\" (UniqueName: \"kubernetes.io/projected/a9e49c58-8075-46a1-8bfd-44412a673589-kube-api-access-m5pwb\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:17 crc kubenswrapper[4841]: I0130 05:27:17.904689 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " pod="openstack/glance-default-external-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.123712 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.187019 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.218214 4841 generic.go:334] "Generic (PLEG): container finished" podID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerID="c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723" exitCode=0 Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.218254 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0","Type":"ContainerDied","Data":"c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723"} Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.218280 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0","Type":"ContainerDied","Data":"2d3900b1448fe376ef696f3456d87f5c517c207d97e6b3b893204ecbe37b4613"} Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.218296 4841 scope.go:117] "RemoveContainer" containerID="c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.218390 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261319 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261558 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-config-data\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261579 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc2n5\" (UniqueName: \"kubernetes.io/projected/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-kube-api-access-bc2n5\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261715 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-internal-tls-certs\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261776 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-httpd-run\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261805 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-scripts\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261828 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-logs\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.261859 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-combined-ca-bundle\") pod \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\" (UID: \"9d68f5be-dc23-4bfd-9dc3-4d934aff76d0\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.263533 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-logs" (OuterVolumeSpecName: "logs") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.263611 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.267414 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.267496 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-scripts" (OuterVolumeSpecName: "scripts") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.273225 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-kube-api-access-bc2n5" (OuterVolumeSpecName: "kube-api-access-bc2n5") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "kube-api-access-bc2n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.291554 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.312191 4841 scope.go:117] "RemoveContainer" containerID="e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.326657 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-config-data" (OuterVolumeSpecName: "config-data") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.345102 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" (UID: "9d68f5be-dc23-4bfd-9dc3-4d934aff76d0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.354551 4841 scope.go:117] "RemoveContainer" containerID="c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723" Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.360738 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723\": container with ID starting with c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723 not found: ID does not exist" containerID="c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.360800 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723"} err="failed to get container status \"c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723\": rpc error: code = NotFound desc = could not find container \"c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723\": container with ID starting with c36e71db3c14264d57e279f0d10849ac31ef0ad0b9a2b4151b61698a82ea7723 not found: ID does not exist" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.360869 4841 scope.go:117] "RemoveContainer" containerID="e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52" Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.361382 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52\": container with ID starting with e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52 not found: ID does not exist" containerID="e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.361447 
4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52"} err="failed to get container status \"e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52\": rpc error: code = NotFound desc = could not find container \"e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52\": container with ID starting with e583d00a8ed0b6857b53c84da97edb8216115958ba35533068bb8296acb89d52 not found: ID does not exist" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366188 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366225 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366251 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366262 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366273 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc2n5\" (UniqueName: \"kubernetes.io/projected/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-kube-api-access-bc2n5\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366281 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366288 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.366296 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.386935 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.441499 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556" path="/var/lib/kubelet/pods/fb3bf6fa-a8e3-43f1-b7ea-aac9debc1556/volumes" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.468288 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.570701 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.603899 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.645990 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.676096 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.676778 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-notification-agent" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.676790 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-notification-agent" Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.676809 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="proxy-httpd" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.676815 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="proxy-httpd" Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.676831 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-log" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.676838 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-log" Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.676860 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-httpd" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.676865 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-httpd" Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.676876 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" 
containerName="sg-core" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.676882 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="sg-core" Jan 30 05:27:18 crc kubenswrapper[4841]: E0130 05:27:18.676911 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-central-agent" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.676916 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-central-agent" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.677206 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="proxy-httpd" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.677229 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-log" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.677242 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" containerName="glance-httpd" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.677254 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-notification-agent" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.677271 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="ceilometer-central-agent" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.677288 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerName="sg-core" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.678667 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.698435 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.698748 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.730457 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.779000 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5m9p\" (UniqueName: \"kubernetes.io/projected/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-kube-api-access-z5m9p\") pod \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.779053 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-log-httpd\") pod \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.779092 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-scripts\") pod \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.779128 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-config-data\") pod \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " Jan 30 
05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.779181 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-run-httpd\") pod \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.779230 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-sg-core-conf-yaml\") pod \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.779288 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-combined-ca-bundle\") pod \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\" (UID: \"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454\") " Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.785582 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-scripts" (OuterVolumeSpecName: "scripts") pod "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" (UID: "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.785864 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" (UID: "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.786161 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" (UID: "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.802687 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-kube-api-access-z5m9p" (OuterVolumeSpecName: "kube-api-access-z5m9p") pod "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" (UID: "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454"). InnerVolumeSpecName "kube-api-access-z5m9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.859081 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" (UID: "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.865259 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:27:18 crc kubenswrapper[4841]: W0130 05:27:18.868846 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9e49c58_8075_46a1_8bfd_44412a673589.slice/crio-87ea243204ba0ef9295c71af73b5b0686f8305cf223636aee885ab9dcd4663a4 WatchSource:0}: Error finding container 87ea243204ba0ef9295c71af73b5b0686f8305cf223636aee885ab9dcd4663a4: Status 404 returned error can't find the container with id 87ea243204ba0ef9295c71af73b5b0686f8305cf223636aee885ab9dcd4663a4 Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880670 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880717 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880780 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880803 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880818 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880846 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9ltz\" (UniqueName: \"kubernetes.io/projected/73fdf532-7bb7-43db-acbc-b949166ccd6b-kube-api-access-q9ltz\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880908 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880930 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 
05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880972 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880983 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5m9p\" (UniqueName: \"kubernetes.io/projected/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-kube-api-access-z5m9p\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.880993 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.881001 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.881008 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.894493 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" (UID: "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.940523 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-config-data" (OuterVolumeSpecName: "config-data") pod "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" (UID: "e91f8b2f-dd5c-435f-b0bb-3ee3b8437454"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.982820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.982909 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.982932 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.982948 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 
05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.982976 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9ltz\" (UniqueName: \"kubernetes.io/projected/73fdf532-7bb7-43db-acbc-b949166ccd6b-kube-api-access-q9ltz\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983057 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983081 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983129 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983140 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983325 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983833 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.983940 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.986983 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.989202 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 
05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.990207 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:18 crc kubenswrapper[4841]: I0130 05:27:18.990819 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.001089 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9ltz\" (UniqueName: \"kubernetes.io/projected/73fdf532-7bb7-43db-acbc-b949166ccd6b-kube-api-access-q9ltz\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.010858 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " pod="openstack/glance-default-internal-api-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.045675 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.231634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9e49c58-8075-46a1-8bfd-44412a673589","Type":"ContainerStarted","Data":"87ea243204ba0ef9295c71af73b5b0686f8305cf223636aee885ab9dcd4663a4"} Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.240454 4841 generic.go:334] "Generic (PLEG): container finished" podID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" containerID="0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62" exitCode=0 Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.240516 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerDied","Data":"0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62"} Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.240554 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91f8b2f-dd5c-435f-b0bb-3ee3b8437454","Type":"ContainerDied","Data":"06b535c52dbbbc78e205de2f1dcc6406f61c1d417aab7b252d06a50b4ea00012"} Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.240571 4841 scope.go:117] "RemoveContainer" containerID="5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.240683 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.283696 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.298459 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.298840 4841 scope.go:117] "RemoveContainer" containerID="1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.308523 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.310557 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.312384 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.312679 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.317131 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.346553 4841 scope.go:117] "RemoveContainer" containerID="586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.367256 4841 scope.go:117] "RemoveContainer" containerID="0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.385842 4841 scope.go:117] "RemoveContainer" containerID="5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094" Jan 30 05:27:19 crc kubenswrapper[4841]: E0130 05:27:19.386192 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094\": container with ID starting with 5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094 not found: ID does not exist" containerID="5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.386225 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094"} err="failed to get container status \"5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094\": rpc error: code = NotFound desc = could not find container \"5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094\": container with ID starting with 5caf65545c81b78d2268c8116ba3e02ad11edfa11cf359b548e9753b060d0094 not found: ID does not exist" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.386252 4841 scope.go:117] "RemoveContainer" containerID="1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576" Jan 30 05:27:19 crc kubenswrapper[4841]: E0130 05:27:19.386491 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576\": container with ID starting with 1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576 not found: ID does not exist" containerID="1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.386522 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576"} err="failed to get container status \"1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576\": rpc error: code = NotFound desc = could not find container 
\"1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576\": container with ID starting with 1ceb7f5e9a558dae0e5eae7bdc291afb240bee21c618849f0e9bae2aab853576 not found: ID does not exist" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.386556 4841 scope.go:117] "RemoveContainer" containerID="586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a" Jan 30 05:27:19 crc kubenswrapper[4841]: E0130 05:27:19.386893 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a\": container with ID starting with 586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a not found: ID does not exist" containerID="586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.386910 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a"} err="failed to get container status \"586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a\": rpc error: code = NotFound desc = could not find container \"586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a\": container with ID starting with 586d420b895c7411701364fb161bd74f8731eba821f9a52dd49f75667cd03e3a not found: ID does not exist" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.386923 4841 scope.go:117] "RemoveContainer" containerID="0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62" Jan 30 05:27:19 crc kubenswrapper[4841]: E0130 05:27:19.387133 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62\": container with ID starting with 0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62 not found: ID does not exist" 
containerID="0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.387149 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62"} err="failed to get container status \"0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62\": rpc error: code = NotFound desc = could not find container \"0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62\": container with ID starting with 0137347f2747624021a9014e22d5049b818216eae761b0d21774d63a8737eb62 not found: ID does not exist" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.393051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x97pl\" (UniqueName: \"kubernetes.io/projected/89b3dd40-3f94-42a2-ab50-564dbe9eef21-kube-api-access-x97pl\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.393096 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.393155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-log-httpd\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.393172 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.393207 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-config-data\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.393233 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-run-httpd\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.393251 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-scripts\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.494441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-run-httpd\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.494486 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-scripts\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc 
kubenswrapper[4841]: I0130 05:27:19.494611 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x97pl\" (UniqueName: \"kubernetes.io/projected/89b3dd40-3f94-42a2-ab50-564dbe9eef21-kube-api-access-x97pl\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.494657 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.494685 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-log-httpd\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.494701 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.494752 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-config-data\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.495215 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-log-httpd\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.495748 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-run-httpd\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.502221 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-scripts\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.503028 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.503895 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.504441 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-config-data\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.513324 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x97pl\" (UniqueName: \"kubernetes.io/projected/89b3dd40-3f94-42a2-ab50-564dbe9eef21-kube-api-access-x97pl\") pod \"ceilometer-0\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " pod="openstack/ceilometer-0" Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.586291 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:27:19 crc kubenswrapper[4841]: I0130 05:27:19.672751 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.137304 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.266971 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"73fdf532-7bb7-43db-acbc-b949166ccd6b","Type":"ContainerStarted","Data":"f9bffcb1c98bf20cc71c95a2813cb2b9fa85cff321938e56af140ba1916af595"} Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.267221 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"73fdf532-7bb7-43db-acbc-b949166ccd6b","Type":"ContainerStarted","Data":"8c202ec5dddac615730615bf9ba4d68a18e872bb7cf1ae7d419c5d66943a6f6c"} Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.272651 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9e49c58-8075-46a1-8bfd-44412a673589","Type":"ContainerStarted","Data":"423863723ace49a11675767356ead1a32dc7658d7769ea7c64a96cba5343a3ca"} Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.272696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a9e49c58-8075-46a1-8bfd-44412a673589","Type":"ContainerStarted","Data":"31ef83de4f5eec8256ad9a9714b034f94d5c23abce8bac25f0acaf273fe574d8"} Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.275615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerStarted","Data":"e4e1e45b3f81028e1b0286239f83e2f457d0028dbc470739e382b0d7ce70b0a7"} Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.293993 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.293978674 podStartE2EDuration="3.293978674s" podCreationTimestamp="2026-01-30 05:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:20.293753418 +0000 UTC m=+1177.287226056" watchObservedRunningTime="2026-01-30 05:27:20.293978674 +0000 UTC m=+1177.287451302" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.450002 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d68f5be-dc23-4bfd-9dc3-4d934aff76d0" path="/var/lib/kubelet/pods/9d68f5be-dc23-4bfd-9dc3-4d934aff76d0/volumes" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.450962 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91f8b2f-dd5c-435f-b0bb-3ee3b8437454" path="/var/lib/kubelet/pods/e91f8b2f-dd5c-435f-b0bb-3ee3b8437454/volumes" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.839628 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.917746 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-httpd-config\") pod \"3afd6c02-28f4-4dae-92ea-b3d04085383b\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.917890 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-combined-ca-bundle\") pod \"3afd6c02-28f4-4dae-92ea-b3d04085383b\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.917921 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hmbk\" (UniqueName: \"kubernetes.io/projected/3afd6c02-28f4-4dae-92ea-b3d04085383b-kube-api-access-6hmbk\") pod \"3afd6c02-28f4-4dae-92ea-b3d04085383b\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.918086 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-config\") pod \"3afd6c02-28f4-4dae-92ea-b3d04085383b\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.918111 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-ovndb-tls-certs\") pod \"3afd6c02-28f4-4dae-92ea-b3d04085383b\" (UID: \"3afd6c02-28f4-4dae-92ea-b3d04085383b\") " Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.936052 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3afd6c02-28f4-4dae-92ea-b3d04085383b" (UID: "3afd6c02-28f4-4dae-92ea-b3d04085383b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.936449 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afd6c02-28f4-4dae-92ea-b3d04085383b-kube-api-access-6hmbk" (OuterVolumeSpecName: "kube-api-access-6hmbk") pod "3afd6c02-28f4-4dae-92ea-b3d04085383b" (UID: "3afd6c02-28f4-4dae-92ea-b3d04085383b"). InnerVolumeSpecName "kube-api-access-6hmbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.966992 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-config" (OuterVolumeSpecName: "config") pod "3afd6c02-28f4-4dae-92ea-b3d04085383b" (UID: "3afd6c02-28f4-4dae-92ea-b3d04085383b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.975969 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3afd6c02-28f4-4dae-92ea-b3d04085383b" (UID: "3afd6c02-28f4-4dae-92ea-b3d04085383b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:20 crc kubenswrapper[4841]: I0130 05:27:20.979481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3afd6c02-28f4-4dae-92ea-b3d04085383b" (UID: "3afd6c02-28f4-4dae-92ea-b3d04085383b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.020755 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.020996 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.021072 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.021138 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afd6c02-28f4-4dae-92ea-b3d04085383b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.021200 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hmbk\" (UniqueName: \"kubernetes.io/projected/3afd6c02-28f4-4dae-92ea-b3d04085383b-kube-api-access-6hmbk\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.303717 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"73fdf532-7bb7-43db-acbc-b949166ccd6b","Type":"ContainerStarted","Data":"249a278941e10cc86f1a6bbc7212e043a936d0a3e53d7ee03f0ed2a158d8459b"} Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.306259 4841 generic.go:334] "Generic (PLEG): container finished" podID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerID="667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345" exitCode=0 Jan 30 05:27:21 crc kubenswrapper[4841]: 
I0130 05:27:21.306345 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5986d768-c66l5" event={"ID":"3afd6c02-28f4-4dae-92ea-b3d04085383b","Type":"ContainerDied","Data":"667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345"} Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.306384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5986d768-c66l5" event={"ID":"3afd6c02-28f4-4dae-92ea-b3d04085383b","Type":"ContainerDied","Data":"36b6381960abe233954061fb0206a9bd82970ebe445f1def087e33f4c6cbb3d9"} Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.306455 4841 scope.go:117] "RemoveContainer" containerID="db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.306612 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f5986d768-c66l5" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.321003 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerStarted","Data":"3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16"} Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.337729 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.337708512 podStartE2EDuration="3.337708512s" podCreationTimestamp="2026-01-30 05:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:21.327607058 +0000 UTC m=+1178.321079726" watchObservedRunningTime="2026-01-30 05:27:21.337708512 +0000 UTC m=+1178.331181150" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.355734 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f5986d768-c66l5"] Jan 30 05:27:21 crc 
kubenswrapper[4841]: I0130 05:27:21.359948 4841 scope.go:117] "RemoveContainer" containerID="667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.363027 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f5986d768-c66l5"] Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.381893 4841 scope.go:117] "RemoveContainer" containerID="db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f" Jan 30 05:27:21 crc kubenswrapper[4841]: E0130 05:27:21.384438 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f\": container with ID starting with db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f not found: ID does not exist" containerID="db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.384472 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f"} err="failed to get container status \"db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f\": rpc error: code = NotFound desc = could not find container \"db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f\": container with ID starting with db511b3c63e64490035cd3249413695bdf49e4668ecf1a798e170265971f471f not found: ID does not exist" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.384493 4841 scope.go:117] "RemoveContainer" containerID="667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345" Jan 30 05:27:21 crc kubenswrapper[4841]: E0130 05:27:21.385092 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345\": 
container with ID starting with 667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345 not found: ID does not exist" containerID="667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.385132 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345"} err="failed to get container status \"667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345\": rpc error: code = NotFound desc = could not find container \"667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345\": container with ID starting with 667fc496f5fcedd9d79f070ced5a148226c33b47bd2a4db4a27b4170a9ee9345 not found: ID does not exist" Jan 30 05:27:21 crc kubenswrapper[4841]: I0130 05:27:21.969013 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:22 crc kubenswrapper[4841]: I0130 05:27:22.344469 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerStarted","Data":"69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889"} Jan 30 05:27:22 crc kubenswrapper[4841]: I0130 05:27:22.344580 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerStarted","Data":"ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7"} Jan 30 05:27:22 crc kubenswrapper[4841]: I0130 05:27:22.445909 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" path="/var/lib/kubelet/pods/3afd6c02-28f4-4dae-92ea-b3d04085383b/volumes" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.629011 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-j4ls7"] Jan 30 05:27:23 crc kubenswrapper[4841]: E0130 
05:27:23.629383 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-api" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.629411 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-api" Jan 30 05:27:23 crc kubenswrapper[4841]: E0130 05:27:23.629427 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-httpd" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.629435 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-httpd" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.629597 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-httpd" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.629623 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afd6c02-28f4-4dae-92ea-b3d04085383b" containerName="neutron-api" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.630151 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.660062 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j4ls7"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.678701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2214644-246d-408c-9315-91b23e85d3f2-operator-scripts\") pod \"nova-api-db-create-j4ls7\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.678897 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxfdx\" (UniqueName: \"kubernetes.io/projected/d2214644-246d-408c-9315-91b23e85d3f2-kube-api-access-lxfdx\") pod \"nova-api-db-create-j4ls7\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.723276 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-brvmt"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.724460 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.739478 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-brvmt"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.750239 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d99c-account-create-update-hpmhc"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.751317 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.753145 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.759949 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d99c-account-create-update-hpmhc"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.780131 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzhbj\" (UniqueName: \"kubernetes.io/projected/4f2c58b7-18b1-453c-be45-be86c5008871-kube-api-access-hzhbj\") pod \"nova-cell0-db-create-brvmt\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.780298 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f2c58b7-18b1-453c-be45-be86c5008871-operator-scripts\") pod \"nova-cell0-db-create-brvmt\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.780350 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfdx\" (UniqueName: \"kubernetes.io/projected/d2214644-246d-408c-9315-91b23e85d3f2-kube-api-access-lxfdx\") pod \"nova-api-db-create-j4ls7\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.780429 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2214644-246d-408c-9315-91b23e85d3f2-operator-scripts\") pod \"nova-api-db-create-j4ls7\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " 
pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.781191 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2214644-246d-408c-9315-91b23e85d3f2-operator-scripts\") pod \"nova-api-db-create-j4ls7\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.802277 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxfdx\" (UniqueName: \"kubernetes.io/projected/d2214644-246d-408c-9315-91b23e85d3f2-kube-api-access-lxfdx\") pod \"nova-api-db-create-j4ls7\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.882139 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmbkr\" (UniqueName: \"kubernetes.io/projected/60131c9c-b83a-472c-ad14-5ea846e9b04d-kube-api-access-lmbkr\") pod \"nova-api-d99c-account-create-update-hpmhc\" (UID: \"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.882240 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f2c58b7-18b1-453c-be45-be86c5008871-operator-scripts\") pod \"nova-cell0-db-create-brvmt\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.882671 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60131c9c-b83a-472c-ad14-5ea846e9b04d-operator-scripts\") pod \"nova-api-d99c-account-create-update-hpmhc\" (UID: 
\"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.882777 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhbj\" (UniqueName: \"kubernetes.io/projected/4f2c58b7-18b1-453c-be45-be86c5008871-kube-api-access-hzhbj\") pod \"nova-cell0-db-create-brvmt\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.883304 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f2c58b7-18b1-453c-be45-be86c5008871-operator-scripts\") pod \"nova-cell0-db-create-brvmt\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.914042 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhbj\" (UniqueName: \"kubernetes.io/projected/4f2c58b7-18b1-453c-be45-be86c5008871-kube-api-access-hzhbj\") pod \"nova-cell0-db-create-brvmt\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.942858 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wsbdm"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.944701 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.955506 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.969688 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-9sgmx"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.971804 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.973288 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.981557 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wsbdm"] Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.987486 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60131c9c-b83a-472c-ad14-5ea846e9b04d-operator-scripts\") pod \"nova-api-d99c-account-create-update-hpmhc\" (UID: \"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.987586 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmbkr\" (UniqueName: \"kubernetes.io/projected/60131c9c-b83a-472c-ad14-5ea846e9b04d-kube-api-access-lmbkr\") pod \"nova-api-d99c-account-create-update-hpmhc\" (UID: \"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.988390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60131c9c-b83a-472c-ad14-5ea846e9b04d-operator-scripts\") pod \"nova-api-d99c-account-create-update-hpmhc\" (UID: \"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " 
pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:23 crc kubenswrapper[4841]: I0130 05:27:23.990553 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-9sgmx"] Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.011876 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmbkr\" (UniqueName: \"kubernetes.io/projected/60131c9c-b83a-472c-ad14-5ea846e9b04d-kube-api-access-lmbkr\") pod \"nova-api-d99c-account-create-update-hpmhc\" (UID: \"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.041867 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.090678 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fvk\" (UniqueName: \"kubernetes.io/projected/bda8dda1-2f95-43f2-952f-79c7f8adbb63-kube-api-access-85fvk\") pod \"nova-cell1-db-create-wsbdm\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.090927 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-operator-scripts\") pod \"nova-cell0-1fec-account-create-update-9sgmx\" (UID: \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.090946 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8dda1-2f95-43f2-952f-79c7f8adbb63-operator-scripts\") pod 
\"nova-cell1-db-create-wsbdm\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.090978 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkkkd\" (UniqueName: \"kubernetes.io/projected/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-kube-api-access-pkkkd\") pod \"nova-cell0-1fec-account-create-update-9sgmx\" (UID: \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.144924 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-164a-account-create-update-vqvpq"] Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.146224 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.148656 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.149802 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.152329 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-164a-account-create-update-vqvpq"] Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.194237 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85fvk\" (UniqueName: \"kubernetes.io/projected/bda8dda1-2f95-43f2-952f-79c7f8adbb63-kube-api-access-85fvk\") pod \"nova-cell1-db-create-wsbdm\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.194285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-operator-scripts\") pod \"nova-cell0-1fec-account-create-update-9sgmx\" (UID: \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.194307 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8dda1-2f95-43f2-952f-79c7f8adbb63-operator-scripts\") pod \"nova-cell1-db-create-wsbdm\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.194342 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkkkd\" (UniqueName: \"kubernetes.io/projected/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-kube-api-access-pkkkd\") pod \"nova-cell0-1fec-account-create-update-9sgmx\" (UID: 
\"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.195348 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8dda1-2f95-43f2-952f-79c7f8adbb63-operator-scripts\") pod \"nova-cell1-db-create-wsbdm\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.199014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-operator-scripts\") pod \"nova-cell0-1fec-account-create-update-9sgmx\" (UID: \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.212927 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fvk\" (UniqueName: \"kubernetes.io/projected/bda8dda1-2f95-43f2-952f-79c7f8adbb63-kube-api-access-85fvk\") pod \"nova-cell1-db-create-wsbdm\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.215268 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkkkd\" (UniqueName: \"kubernetes.io/projected/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-kube-api-access-pkkkd\") pod \"nova-cell0-1fec-account-create-update-9sgmx\" (UID: \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.296412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-operator-scripts\") pod 
\"nova-cell1-164a-account-create-update-vqvpq\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.296471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4psfw\" (UniqueName: \"kubernetes.io/projected/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-kube-api-access-4psfw\") pod \"nova-cell1-164a-account-create-update-vqvpq\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.341222 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.349682 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.369306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerStarted","Data":"1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66"} Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.369482 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="ceilometer-central-agent" containerID="cri-o://3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16" gracePeriod=30 Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.369729 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.369945 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="proxy-httpd" containerID="cri-o://1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66" gracePeriod=30 Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.369987 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="sg-core" containerID="cri-o://69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889" gracePeriod=30 Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.370020 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="ceilometer-notification-agent" containerID="cri-o://ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7" gracePeriod=30 Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.398427 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-operator-scripts\") pod \"nova-cell1-164a-account-create-update-vqvpq\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.398518 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4psfw\" (UniqueName: \"kubernetes.io/projected/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-kube-api-access-4psfw\") pod \"nova-cell1-164a-account-create-update-vqvpq\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.400133 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-operator-scripts\") pod 
\"nova-cell1-164a-account-create-update-vqvpq\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.407069 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.798109618 podStartE2EDuration="5.407009868s" podCreationTimestamp="2026-01-30 05:27:19 +0000 UTC" firstStartedPulling="2026-01-30 05:27:20.143022413 +0000 UTC m=+1177.136495061" lastFinishedPulling="2026-01-30 05:27:23.751922673 +0000 UTC m=+1180.745395311" observedRunningTime="2026-01-30 05:27:24.397515811 +0000 UTC m=+1181.390988449" watchObservedRunningTime="2026-01-30 05:27:24.407009868 +0000 UTC m=+1181.400482506" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.418608 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4psfw\" (UniqueName: \"kubernetes.io/projected/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-kube-api-access-4psfw\") pod \"nova-cell1-164a-account-create-update-vqvpq\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.465946 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j4ls7"] Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.469878 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.551199 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-brvmt"] Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.653669 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d99c-account-create-update-hpmhc"] Jan 30 05:27:24 crc kubenswrapper[4841]: W0130 05:27:24.690559 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60131c9c_b83a_472c_ad14_5ea846e9b04d.slice/crio-d6c2bcd6b0ffd18a74d6935bd7432c7c4944451034002027660bbf861cc2b3f1 WatchSource:0}: Error finding container d6c2bcd6b0ffd18a74d6935bd7432c7c4944451034002027660bbf861cc2b3f1: Status 404 returned error can't find the container with id d6c2bcd6b0ffd18a74d6935bd7432c7c4944451034002027660bbf861cc2b3f1 Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.843550 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wsbdm"] Jan 30 05:27:24 crc kubenswrapper[4841]: I0130 05:27:24.923702 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-9sgmx"] Jan 30 05:27:24 crc kubenswrapper[4841]: E0130 05:27:24.983881 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89b3dd40_3f94_42a2_ab50_564dbe9eef21.slice/crio-ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89b3dd40_3f94_42a2_ab50_564dbe9eef21.slice/crio-conmon-ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:27:25 crc 
kubenswrapper[4841]: I0130 05:27:25.009547 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-164a-account-create-update-vqvpq"] Jan 30 05:27:25 crc kubenswrapper[4841]: W0130 05:27:25.013512 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac5bcf96_ceff_47c0_9dbb_4a96c0e7c8b1.slice/crio-6f6f2ce027a0dc7c2c6439f60d4c477952aa659a1fb40b3e21bbdcb7d6ef724a WatchSource:0}: Error finding container 6f6f2ce027a0dc7c2c6439f60d4c477952aa659a1fb40b3e21bbdcb7d6ef724a: Status 404 returned error can't find the container with id 6f6f2ce027a0dc7c2c6439f60d4c477952aa659a1fb40b3e21bbdcb7d6ef724a Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.381887 4841 generic.go:334] "Generic (PLEG): container finished" podID="4f2c58b7-18b1-453c-be45-be86c5008871" containerID="516c5fd4cd75d7f12188d52666c9e2f8ee88173bfc6842c75dfd525085b58059" exitCode=0 Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.382006 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-brvmt" event={"ID":"4f2c58b7-18b1-453c-be45-be86c5008871","Type":"ContainerDied","Data":"516c5fd4cd75d7f12188d52666c9e2f8ee88173bfc6842c75dfd525085b58059"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.382070 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-brvmt" event={"ID":"4f2c58b7-18b1-453c-be45-be86c5008871","Type":"ContainerStarted","Data":"2b11ec490502485fcfffcecfb7d41c30f806c1374b9a25e453ab159f18f04543"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.384520 4841 generic.go:334] "Generic (PLEG): container finished" podID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerID="1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66" exitCode=0 Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.384561 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerID="69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889" exitCode=2 Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.384569 4841 generic.go:334] "Generic (PLEG): container finished" podID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerID="ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7" exitCode=0 Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.384597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerDied","Data":"1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.384639 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerDied","Data":"69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.384652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerDied","Data":"ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.387383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" event={"ID":"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1","Type":"ContainerStarted","Data":"4569b4b5e37876eda335dcbd62e5a674767059e5c757482cdf828e323ff293d5"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.387549 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" event={"ID":"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1","Type":"ContainerStarted","Data":"6f6f2ce027a0dc7c2c6439f60d4c477952aa659a1fb40b3e21bbdcb7d6ef724a"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 
05:27:25.389056 4841 generic.go:334] "Generic (PLEG): container finished" podID="60131c9c-b83a-472c-ad14-5ea846e9b04d" containerID="4d21eef9c1c9ed046a97390ad3dd4aed5f37df0881ec47b58bf4b020f79db06f" exitCode=0 Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.389137 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d99c-account-create-update-hpmhc" event={"ID":"60131c9c-b83a-472c-ad14-5ea846e9b04d","Type":"ContainerDied","Data":"4d21eef9c1c9ed046a97390ad3dd4aed5f37df0881ec47b58bf4b020f79db06f"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.389286 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d99c-account-create-update-hpmhc" event={"ID":"60131c9c-b83a-472c-ad14-5ea846e9b04d","Type":"ContainerStarted","Data":"d6c2bcd6b0ffd18a74d6935bd7432c7c4944451034002027660bbf861cc2b3f1"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.391678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wsbdm" event={"ID":"bda8dda1-2f95-43f2-952f-79c7f8adbb63","Type":"ContainerStarted","Data":"f6437c7d036b9ccabd53ece2b8030d2d059235373e2a041d4eb65c202968d653"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.391830 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wsbdm" event={"ID":"bda8dda1-2f95-43f2-952f-79c7f8adbb63","Type":"ContainerStarted","Data":"5fffc879a06934af09c7e65ae13fa718580400939977c7d4fc272b534c2df8b5"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.393613 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" event={"ID":"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3","Type":"ContainerStarted","Data":"f7006000010c70fd3939a321e93d354c55c7a81ade457ce7388432e79c0a136c"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.393676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" 
event={"ID":"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3","Type":"ContainerStarted","Data":"98ab9b295002614d1e5c31c50833f338a837c5a3441107b9df1a384bf3783f3a"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.395152 4841 generic.go:334] "Generic (PLEG): container finished" podID="d2214644-246d-408c-9315-91b23e85d3f2" containerID="324bbfef51820fed27231247bde9ff6d0f8f2aab3fd13c87c30968e9cff134b6" exitCode=0 Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.395194 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4ls7" event={"ID":"d2214644-246d-408c-9315-91b23e85d3f2","Type":"ContainerDied","Data":"324bbfef51820fed27231247bde9ff6d0f8f2aab3fd13c87c30968e9cff134b6"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.395238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4ls7" event={"ID":"d2214644-246d-408c-9315-91b23e85d3f2","Type":"ContainerStarted","Data":"131f8b9782786986578544fa75acec7ebb036171532c9d718eeb9d9ff56eff1d"} Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.429292 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" podStartSLOduration=2.429276716 podStartE2EDuration="2.429276716s" podCreationTimestamp="2026-01-30 05:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:25.428823583 +0000 UTC m=+1182.422296221" watchObservedRunningTime="2026-01-30 05:27:25.429276716 +0000 UTC m=+1182.422749354" Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.448532 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-wsbdm" podStartSLOduration=2.448514269 podStartE2EDuration="2.448514269s" podCreationTimestamp="2026-01-30 05:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:25.44359483 +0000 UTC m=+1182.437067468" watchObservedRunningTime="2026-01-30 05:27:25.448514269 +0000 UTC m=+1182.441986907" Jan 30 05:27:25 crc kubenswrapper[4841]: I0130 05:27:25.460832 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" podStartSLOduration=1.460817321 podStartE2EDuration="1.460817321s" podCreationTimestamp="2026-01-30 05:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:25.460480322 +0000 UTC m=+1182.453952960" watchObservedRunningTime="2026-01-30 05:27:25.460817321 +0000 UTC m=+1182.454289959" Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.412054 4841 generic.go:334] "Generic (PLEG): container finished" podID="3c5871a8-286c-4ae9-90b8-5accaf3e8fa3" containerID="f7006000010c70fd3939a321e93d354c55c7a81ade457ce7388432e79c0a136c" exitCode=0 Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.412180 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" event={"ID":"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3","Type":"ContainerDied","Data":"f7006000010c70fd3939a321e93d354c55c7a81ade457ce7388432e79c0a136c"} Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.417120 4841 generic.go:334] "Generic (PLEG): container finished" podID="ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1" containerID="4569b4b5e37876eda335dcbd62e5a674767059e5c757482cdf828e323ff293d5" exitCode=0 Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.417209 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" event={"ID":"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1","Type":"ContainerDied","Data":"4569b4b5e37876eda335dcbd62e5a674767059e5c757482cdf828e323ff293d5"} Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.420649 
4841 generic.go:334] "Generic (PLEG): container finished" podID="bda8dda1-2f95-43f2-952f-79c7f8adbb63" containerID="f6437c7d036b9ccabd53ece2b8030d2d059235373e2a041d4eb65c202968d653" exitCode=0 Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.421060 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wsbdm" event={"ID":"bda8dda1-2f95-43f2-952f-79c7f8adbb63","Type":"ContainerDied","Data":"f6437c7d036b9ccabd53ece2b8030d2d059235373e2a041d4eb65c202968d653"} Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.821415 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.906467 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.911450 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.959833 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzhbj\" (UniqueName: \"kubernetes.io/projected/4f2c58b7-18b1-453c-be45-be86c5008871-kube-api-access-hzhbj\") pod \"4f2c58b7-18b1-453c-be45-be86c5008871\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.959983 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f2c58b7-18b1-453c-be45-be86c5008871-operator-scripts\") pod \"4f2c58b7-18b1-453c-be45-be86c5008871\" (UID: \"4f2c58b7-18b1-453c-be45-be86c5008871\") " Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.961776 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4f2c58b7-18b1-453c-be45-be86c5008871-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f2c58b7-18b1-453c-be45-be86c5008871" (UID: "4f2c58b7-18b1-453c-be45-be86c5008871"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:26 crc kubenswrapper[4841]: I0130 05:27:26.969789 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2c58b7-18b1-453c-be45-be86c5008871-kube-api-access-hzhbj" (OuterVolumeSpecName: "kube-api-access-hzhbj") pod "4f2c58b7-18b1-453c-be45-be86c5008871" (UID: "4f2c58b7-18b1-453c-be45-be86c5008871"). InnerVolumeSpecName "kube-api-access-hzhbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.067510 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmbkr\" (UniqueName: \"kubernetes.io/projected/60131c9c-b83a-472c-ad14-5ea846e9b04d-kube-api-access-lmbkr\") pod \"60131c9c-b83a-472c-ad14-5ea846e9b04d\" (UID: \"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.068807 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2214644-246d-408c-9315-91b23e85d3f2-operator-scripts\") pod \"d2214644-246d-408c-9315-91b23e85d3f2\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.068961 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60131c9c-b83a-472c-ad14-5ea846e9b04d-operator-scripts\") pod \"60131c9c-b83a-472c-ad14-5ea846e9b04d\" (UID: \"60131c9c-b83a-472c-ad14-5ea846e9b04d\") " Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.069668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-lxfdx\" (UniqueName: \"kubernetes.io/projected/d2214644-246d-408c-9315-91b23e85d3f2-kube-api-access-lxfdx\") pod \"d2214644-246d-408c-9315-91b23e85d3f2\" (UID: \"d2214644-246d-408c-9315-91b23e85d3f2\") " Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.069718 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60131c9c-b83a-472c-ad14-5ea846e9b04d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60131c9c-b83a-472c-ad14-5ea846e9b04d" (UID: "60131c9c-b83a-472c-ad14-5ea846e9b04d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.070629 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2214644-246d-408c-9315-91b23e85d3f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2214644-246d-408c-9315-91b23e85d3f2" (UID: "d2214644-246d-408c-9315-91b23e85d3f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.074017 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2214644-246d-408c-9315-91b23e85d3f2-kube-api-access-lxfdx" (OuterVolumeSpecName: "kube-api-access-lxfdx") pod "d2214644-246d-408c-9315-91b23e85d3f2" (UID: "d2214644-246d-408c-9315-91b23e85d3f2"). InnerVolumeSpecName "kube-api-access-lxfdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.075158 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60131c9c-b83a-472c-ad14-5ea846e9b04d-kube-api-access-lmbkr" (OuterVolumeSpecName: "kube-api-access-lmbkr") pod "60131c9c-b83a-472c-ad14-5ea846e9b04d" (UID: "60131c9c-b83a-472c-ad14-5ea846e9b04d"). InnerVolumeSpecName "kube-api-access-lmbkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.076268 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2214644-246d-408c-9315-91b23e85d3f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.076302 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60131c9c-b83a-472c-ad14-5ea846e9b04d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.076314 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzhbj\" (UniqueName: \"kubernetes.io/projected/4f2c58b7-18b1-453c-be45-be86c5008871-kube-api-access-hzhbj\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.076325 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxfdx\" (UniqueName: \"kubernetes.io/projected/d2214644-246d-408c-9315-91b23e85d3f2-kube-api-access-lxfdx\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.076335 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f2c58b7-18b1-453c-be45-be86c5008871-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.076344 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmbkr\" (UniqueName: \"kubernetes.io/projected/60131c9c-b83a-472c-ad14-5ea846e9b04d-kube-api-access-lmbkr\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.433718 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-hpmhc" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.433711 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d99c-account-create-update-hpmhc" event={"ID":"60131c9c-b83a-472c-ad14-5ea846e9b04d","Type":"ContainerDied","Data":"d6c2bcd6b0ffd18a74d6935bd7432c7c4944451034002027660bbf861cc2b3f1"} Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.433935 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c2bcd6b0ffd18a74d6935bd7432c7c4944451034002027660bbf861cc2b3f1" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.435748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4ls7" event={"ID":"d2214644-246d-408c-9315-91b23e85d3f2","Type":"ContainerDied","Data":"131f8b9782786986578544fa75acec7ebb036171532c9d718eeb9d9ff56eff1d"} Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.435776 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4ls7" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.435795 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="131f8b9782786986578544fa75acec7ebb036171532c9d718eeb9d9ff56eff1d" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.438331 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-brvmt" event={"ID":"4f2c58b7-18b1-453c-be45-be86c5008871","Type":"ContainerDied","Data":"2b11ec490502485fcfffcecfb7d41c30f806c1374b9a25e453ab159f18f04543"} Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.438394 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b11ec490502485fcfffcecfb7d41c30f806c1374b9a25e453ab159f18f04543" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.440314 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-brvmt" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.842614 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.990752 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8dda1-2f95-43f2-952f-79c7f8adbb63-operator-scripts\") pod \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.990800 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85fvk\" (UniqueName: \"kubernetes.io/projected/bda8dda1-2f95-43f2-952f-79c7f8adbb63-kube-api-access-85fvk\") pod \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\" (UID: \"bda8dda1-2f95-43f2-952f-79c7f8adbb63\") " Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.991778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda8dda1-2f95-43f2-952f-79c7f8adbb63-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bda8dda1-2f95-43f2-952f-79c7f8adbb63" (UID: "bda8dda1-2f95-43f2-952f-79c7f8adbb63"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:27 crc kubenswrapper[4841]: I0130 05:27:27.994411 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda8dda1-2f95-43f2-952f-79c7f8adbb63-kube-api-access-85fvk" (OuterVolumeSpecName: "kube-api-access-85fvk") pod "bda8dda1-2f95-43f2-952f-79c7f8adbb63" (UID: "bda8dda1-2f95-43f2-952f-79c7f8adbb63"). InnerVolumeSpecName "kube-api-access-85fvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.036660 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.041905 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.120646 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4psfw\" (UniqueName: \"kubernetes.io/projected/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-kube-api-access-4psfw\") pod \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.120715 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-operator-scripts\") pod \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\" (UID: \"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1\") " Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.120761 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-operator-scripts\") pod \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\" (UID: \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.120854 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkkkd\" (UniqueName: \"kubernetes.io/projected/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-kube-api-access-pkkkd\") pod \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\" (UID: \"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3\") " Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.121184 4841 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda8dda1-2f95-43f2-952f-79c7f8adbb63-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.121200 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85fvk\" (UniqueName: \"kubernetes.io/projected/bda8dda1-2f95-43f2-952f-79c7f8adbb63-kube-api-access-85fvk\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.121519 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1" (UID: "ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.122132 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c5871a8-286c-4ae9-90b8-5accaf3e8fa3" (UID: "3c5871a8-286c-4ae9-90b8-5accaf3e8fa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.125886 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-kube-api-access-pkkkd" (OuterVolumeSpecName: "kube-api-access-pkkkd") pod "3c5871a8-286c-4ae9-90b8-5accaf3e8fa3" (UID: "3c5871a8-286c-4ae9-90b8-5accaf3e8fa3"). InnerVolumeSpecName "kube-api-access-pkkkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.127471 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-kube-api-access-4psfw" (OuterVolumeSpecName: "kube-api-access-4psfw") pod "ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1" (UID: "ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1"). InnerVolumeSpecName "kube-api-access-4psfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.187324 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.187370 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.222948 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4psfw\" (UniqueName: \"kubernetes.io/projected/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-kube-api-access-4psfw\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.222980 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.222989 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.223000 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkkkd\" (UniqueName: \"kubernetes.io/projected/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3-kube-api-access-pkkkd\") on node \"crc\" DevicePath \"\"" 
Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.230295 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.233802 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.456179 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wsbdm" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.456159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wsbdm" event={"ID":"bda8dda1-2f95-43f2-952f-79c7f8adbb63","Type":"ContainerDied","Data":"5fffc879a06934af09c7e65ae13fa718580400939977c7d4fc272b534c2df8b5"} Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.456688 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fffc879a06934af09c7e65ae13fa718580400939977c7d4fc272b534c2df8b5" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.459236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" event={"ID":"3c5871a8-286c-4ae9-90b8-5accaf3e8fa3","Type":"ContainerDied","Data":"98ab9b295002614d1e5c31c50833f338a837c5a3441107b9df1a384bf3783f3a"} Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.459260 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-9sgmx" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.459277 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ab9b295002614d1e5c31c50833f338a837c5a3441107b9df1a384bf3783f3a" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.463061 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.463797 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-164a-account-create-update-vqvpq" event={"ID":"ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1","Type":"ContainerDied","Data":"6f6f2ce027a0dc7c2c6439f60d4c477952aa659a1fb40b3e21bbdcb7d6ef724a"} Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.463819 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f6f2ce027a0dc7c2c6439f60d4c477952aa659a1fb40b3e21bbdcb7d6ef724a" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.463834 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 05:27:28 crc kubenswrapper[4841]: I0130 05:27:28.463934 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.046627 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.046682 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.086498 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.115491 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.279729 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q4xj8"] Jan 30 05:27:29 crc kubenswrapper[4841]: E0130 05:27:29.280036 4841 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="60131c9c-b83a-472c-ad14-5ea846e9b04d" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280051 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60131c9c-b83a-472c-ad14-5ea846e9b04d" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: E0130 05:27:29.280063 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2c58b7-18b1-453c-be45-be86c5008871" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280070 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2c58b7-18b1-453c-be45-be86c5008871" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: E0130 05:27:29.280077 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5871a8-286c-4ae9-90b8-5accaf3e8fa3" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280083 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5871a8-286c-4ae9-90b8-5accaf3e8fa3" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: E0130 05:27:29.280096 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2214644-246d-408c-9315-91b23e85d3f2" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280103 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2214644-246d-408c-9315-91b23e85d3f2" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: E0130 05:27:29.280113 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda8dda1-2f95-43f2-952f-79c7f8adbb63" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280119 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda8dda1-2f95-43f2-952f-79c7f8adbb63" containerName="mariadb-database-create" Jan 30 05:27:29 
crc kubenswrapper[4841]: E0130 05:27:29.280138 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280144 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280292 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5871a8-286c-4ae9-90b8-5accaf3e8fa3" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280303 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2214644-246d-408c-9315-91b23e85d3f2" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280317 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2c58b7-18b1-453c-be45-be86c5008871" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280324 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60131c9c-b83a-472c-ad14-5ea846e9b04d" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280334 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda8dda1-2f95-43f2-952f-79c7f8adbb63" containerName="mariadb-database-create" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280346 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1" containerName="mariadb-account-create-update" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.280838 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.283529 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-47ttp" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.283678 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.284813 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.295458 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q4xj8"] Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.344370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-config-data\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.344461 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-scripts\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.344619 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bkc\" (UniqueName: \"kubernetes.io/projected/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-kube-api-access-m4bkc\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " 
pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.344699 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.446164 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.446257 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-config-data\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.446293 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-scripts\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.446888 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bkc\" (UniqueName: \"kubernetes.io/projected/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-kube-api-access-m4bkc\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: 
\"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.481222 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.481267 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.484360 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.487091 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bkc\" (UniqueName: \"kubernetes.io/projected/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-kube-api-access-m4bkc\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.490208 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-scripts\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.496521 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-config-data\") pod \"nova-cell0-conductor-db-sync-q4xj8\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") " 
pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:29 crc kubenswrapper[4841]: I0130 05:27:29.641539 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q4xj8" Jan 30 05:27:30 crc kubenswrapper[4841]: I0130 05:27:30.067853 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q4xj8"] Jan 30 05:27:30 crc kubenswrapper[4841]: W0130 05:27:30.071841 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b47d5e_2531_4e8c_b1ea_49ba2f52b7c5.slice/crio-af44ecaf717da5c6e2a4fe8c2d1563f2ee5eeec39ae47dcac94f04113488eac4 WatchSource:0}: Error finding container af44ecaf717da5c6e2a4fe8c2d1563f2ee5eeec39ae47dcac94f04113488eac4: Status 404 returned error can't find the container with id af44ecaf717da5c6e2a4fe8c2d1563f2ee5eeec39ae47dcac94f04113488eac4 Jan 30 05:27:30 crc kubenswrapper[4841]: I0130 05:27:30.454640 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 05:27:30 crc kubenswrapper[4841]: I0130 05:27:30.458644 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 05:27:30 crc kubenswrapper[4841]: I0130 05:27:30.490939 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q4xj8" event={"ID":"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5","Type":"ContainerStarted","Data":"af44ecaf717da5c6e2a4fe8c2d1563f2ee5eeec39ae47dcac94f04113488eac4"} Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.030740 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.088930 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-run-httpd\") pod \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.089033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-scripts\") pod \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.089084 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-config-data\") pod \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.089174 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-sg-core-conf-yaml\") pod \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.089219 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-log-httpd\") pod \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.089249 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x97pl\" (UniqueName: 
\"kubernetes.io/projected/89b3dd40-3f94-42a2-ab50-564dbe9eef21-kube-api-access-x97pl\") pod \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.089294 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-combined-ca-bundle\") pod \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\" (UID: \"89b3dd40-3f94-42a2-ab50-564dbe9eef21\") " Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.089903 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "89b3dd40-3f94-42a2-ab50-564dbe9eef21" (UID: "89b3dd40-3f94-42a2-ab50-564dbe9eef21"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.090246 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "89b3dd40-3f94-42a2-ab50-564dbe9eef21" (UID: "89b3dd40-3f94-42a2-ab50-564dbe9eef21"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.094886 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-scripts" (OuterVolumeSpecName: "scripts") pod "89b3dd40-3f94-42a2-ab50-564dbe9eef21" (UID: "89b3dd40-3f94-42a2-ab50-564dbe9eef21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.107568 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b3dd40-3f94-42a2-ab50-564dbe9eef21-kube-api-access-x97pl" (OuterVolumeSpecName: "kube-api-access-x97pl") pod "89b3dd40-3f94-42a2-ab50-564dbe9eef21" (UID: "89b3dd40-3f94-42a2-ab50-564dbe9eef21"). InnerVolumeSpecName "kube-api-access-x97pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.116487 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "89b3dd40-3f94-42a2-ab50-564dbe9eef21" (UID: "89b3dd40-3f94-42a2-ab50-564dbe9eef21"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.165617 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89b3dd40-3f94-42a2-ab50-564dbe9eef21" (UID: "89b3dd40-3f94-42a2-ab50-564dbe9eef21"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.191331 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.191366 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.191385 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x97pl\" (UniqueName: \"kubernetes.io/projected/89b3dd40-3f94-42a2-ab50-564dbe9eef21-kube-api-access-x97pl\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.191478 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.191495 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b3dd40-3f94-42a2-ab50-564dbe9eef21-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.191511 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.197973 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-config-data" (OuterVolumeSpecName: "config-data") pod "89b3dd40-3f94-42a2-ab50-564dbe9eef21" (UID: "89b3dd40-3f94-42a2-ab50-564dbe9eef21"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.292948 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b3dd40-3f94-42a2-ab50-564dbe9eef21-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.356911 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.356970 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.516913 4841 generic.go:334] "Generic (PLEG): container finished" podID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerID="3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16" exitCode=0 Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.517853 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.518200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerDied","Data":"3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16"} Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.518231 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b3dd40-3f94-42a2-ab50-564dbe9eef21","Type":"ContainerDied","Data":"e4e1e45b3f81028e1b0286239f83e2f457d0028dbc470739e382b0d7ce70b0a7"} Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.518247 4841 scope.go:117] "RemoveContainer" containerID="1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.552259 4841 scope.go:117] "RemoveContainer" containerID="69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.560870 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.581546 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.587775 4841 scope.go:117] "RemoveContainer" containerID="ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600081 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.600500 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="ceilometer-central-agent" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600519 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" 
containerName="ceilometer-central-agent" Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.600530 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="ceilometer-notification-agent" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600537 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="ceilometer-notification-agent" Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.600554 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="sg-core" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600560 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="sg-core" Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.600570 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="proxy-httpd" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600576 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="proxy-httpd" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600724 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="proxy-httpd" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600739 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="ceilometer-notification-agent" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600756 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="ceilometer-central-agent" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.600765 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" containerName="sg-core" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.602826 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.608026 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.613993 4841 scope.go:117] "RemoveContainer" containerID="3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.614901 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.615579 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.642925 4841 scope.go:117] "RemoveContainer" containerID="1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66" Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.643302 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66\": container with ID starting with 1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66 not found: ID does not exist" containerID="1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.643330 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66"} err="failed to get container status \"1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66\": rpc error: code = NotFound desc = could not find container 
\"1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66\": container with ID starting with 1a3fcf5d35df90414ef0c5147b3769fe80e2994a9c275db524767b9513d2ff66 not found: ID does not exist" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.643349 4841 scope.go:117] "RemoveContainer" containerID="69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889" Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.643589 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889\": container with ID starting with 69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889 not found: ID does not exist" containerID="69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.643610 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889"} err="failed to get container status \"69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889\": rpc error: code = NotFound desc = could not find container \"69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889\": container with ID starting with 69864771ce6528f7051efa1b4052d4834f26ac4ecf913995fe86120951413889 not found: ID does not exist" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.643622 4841 scope.go:117] "RemoveContainer" containerID="ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7" Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.643801 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7\": container with ID starting with ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7 not found: ID does not exist" 
containerID="ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.643818 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7"} err="failed to get container status \"ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7\": rpc error: code = NotFound desc = could not find container \"ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7\": container with ID starting with ffbd616c67b1aa94b31c923cb08528a6f65002fcdf600c8e8dd529e7125135c7 not found: ID does not exist" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.643831 4841 scope.go:117] "RemoveContainer" containerID="3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16" Jan 30 05:27:31 crc kubenswrapper[4841]: E0130 05:27:31.644048 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16\": container with ID starting with 3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16 not found: ID does not exist" containerID="3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.644074 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16"} err="failed to get container status \"3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16\": rpc error: code = NotFound desc = could not find container \"3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16\": container with ID starting with 3c7e7883744a8d373b54b5e61904cd9d4b7fd5078d18974a33b24fad79711e16 not found: ID does not exist" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.801627 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-run-httpd\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.801680 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-config-data\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.801712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks5sb\" (UniqueName: \"kubernetes.io/projected/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-kube-api-access-ks5sb\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.801748 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-scripts\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.801784 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.801836 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-log-httpd\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.801852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.903505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-log-httpd\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.903553 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.903606 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-run-httpd\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.903633 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-config-data\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc 
kubenswrapper[4841]: I0130 05:27:31.903663 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks5sb\" (UniqueName: \"kubernetes.io/projected/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-kube-api-access-ks5sb\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.903698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-scripts\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.903734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.903982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-log-httpd\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.904153 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-run-httpd\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.907659 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-scripts\") pod \"ceilometer-0\" (UID: 
\"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.913081 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-config-data\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.915216 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.917030 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.927738 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks5sb\" (UniqueName: \"kubernetes.io/projected/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-kube-api-access-ks5sb\") pod \"ceilometer-0\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " pod="openstack/ceilometer-0" Jan 30 05:27:31 crc kubenswrapper[4841]: I0130 05:27:31.930724 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:27:32 crc kubenswrapper[4841]: W0130 05:27:32.443621 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd22c20_b602_4e60_a34f_dc2e2ce718e7.slice/crio-894b253fe02feab536a7b48c4126c9d95e0c096ee8cedae7b1b2286ed11849a9 WatchSource:0}: Error finding container 894b253fe02feab536a7b48c4126c9d95e0c096ee8cedae7b1b2286ed11849a9: Status 404 returned error can't find the container with id 894b253fe02feab536a7b48c4126c9d95e0c096ee8cedae7b1b2286ed11849a9
Jan 30 05:27:32 crc kubenswrapper[4841]: I0130 05:27:32.443656 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b3dd40-3f94-42a2-ab50-564dbe9eef21" path="/var/lib/kubelet/pods/89b3dd40-3f94-42a2-ab50-564dbe9eef21/volumes"
Jan 30 05:27:32 crc kubenswrapper[4841]: I0130 05:27:32.445352 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:27:32 crc kubenswrapper[4841]: I0130 05:27:32.530383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerStarted","Data":"894b253fe02feab536a7b48c4126c9d95e0c096ee8cedae7b1b2286ed11849a9"}
Jan 30 05:27:33 crc kubenswrapper[4841]: I0130 05:27:33.545658 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerStarted","Data":"24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672"}
Jan 30 05:27:38 crc kubenswrapper[4841]: I0130 05:27:38.598944 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerStarted","Data":"2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7"}
Jan 30 05:27:38 crc kubenswrapper[4841]: I0130 05:27:38.625994 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q4xj8" event={"ID":"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5","Type":"ContainerStarted","Data":"995cb343dd7386b355ae3b1c076e4fe5c75a2ee001f0d1549fd9e15f90bb22a0"}
Jan 30 05:27:38 crc kubenswrapper[4841]: I0130 05:27:38.672757 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-q4xj8" podStartSLOduration=1.7132933399999999 podStartE2EDuration="9.672734901s" podCreationTimestamp="2026-01-30 05:27:29 +0000 UTC" firstStartedPulling="2026-01-30 05:27:30.074587732 +0000 UTC m=+1187.068060370" lastFinishedPulling="2026-01-30 05:27:38.034029293 +0000 UTC m=+1195.027501931" observedRunningTime="2026-01-30 05:27:38.655956042 +0000 UTC m=+1195.649428680" watchObservedRunningTime="2026-01-30 05:27:38.672734901 +0000 UTC m=+1195.666207549"
Jan 30 05:27:39 crc kubenswrapper[4841]: I0130 05:27:39.635224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerStarted","Data":"6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953"}
Jan 30 05:27:41 crc kubenswrapper[4841]: I0130 05:27:41.659518 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerStarted","Data":"195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933"}
Jan 30 05:27:41 crc kubenswrapper[4841]: I0130 05:27:41.660132 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 05:27:49 crc kubenswrapper[4841]: I0130 05:27:49.747841 4841 generic.go:334] "Generic (PLEG): container finished" podID="34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" containerID="995cb343dd7386b355ae3b1c076e4fe5c75a2ee001f0d1549fd9e15f90bb22a0" exitCode=0
Jan 30 05:27:49 crc kubenswrapper[4841]: I0130 05:27:49.747886 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q4xj8" event={"ID":"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5","Type":"ContainerDied","Data":"995cb343dd7386b355ae3b1c076e4fe5c75a2ee001f0d1549fd9e15f90bb22a0"}
Jan 30 05:27:49 crc kubenswrapper[4841]: I0130 05:27:49.775140 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=10.563903265 podStartE2EDuration="18.775119957s" podCreationTimestamp="2026-01-30 05:27:31 +0000 UTC" firstStartedPulling="2026-01-30 05:27:32.447418129 +0000 UTC m=+1189.440890767" lastFinishedPulling="2026-01-30 05:27:40.658634811 +0000 UTC m=+1197.652107459" observedRunningTime="2026-01-30 05:27:41.68713479 +0000 UTC m=+1198.680607468" watchObservedRunningTime="2026-01-30 05:27:49.775119957 +0000 UTC m=+1206.768592605"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.289252 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q4xj8"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.305966 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-config-data\") pod \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") "
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.306160 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4bkc\" (UniqueName: \"kubernetes.io/projected/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-kube-api-access-m4bkc\") pod \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") "
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.306351 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-scripts\") pod \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") "
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.306479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-combined-ca-bundle\") pod \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\" (UID: \"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5\") "
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.319041 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-kube-api-access-m4bkc" (OuterVolumeSpecName: "kube-api-access-m4bkc") pod "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" (UID: "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5"). InnerVolumeSpecName "kube-api-access-m4bkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.324178 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-scripts" (OuterVolumeSpecName: "scripts") pod "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" (UID: "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.345690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" (UID: "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.362875 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-config-data" (OuterVolumeSpecName: "config-data") pod "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" (UID: "34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.408807 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4bkc\" (UniqueName: \"kubernetes.io/projected/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-kube-api-access-m4bkc\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.408847 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.408858 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.408866 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.788524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q4xj8" event={"ID":"34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5","Type":"ContainerDied","Data":"af44ecaf717da5c6e2a4fe8c2d1563f2ee5eeec39ae47dcac94f04113488eac4"}
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.788574 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af44ecaf717da5c6e2a4fe8c2d1563f2ee5eeec39ae47dcac94f04113488eac4"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.788607 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q4xj8"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.937825 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 05:27:51 crc kubenswrapper[4841]: E0130 05:27:51.938242 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" containerName="nova-cell0-conductor-db-sync"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.938262 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" containerName="nova-cell0-conductor-db-sync"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.938524 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" containerName="nova-cell0-conductor-db-sync"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.939563 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.954806 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.987088 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 30 05:27:51 crc kubenswrapper[4841]: I0130 05:27:51.987312 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-47ttp"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.019631 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.019914 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.020118 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2t4x\" (UniqueName: \"kubernetes.io/projected/9baa24e8-552c-425b-a494-ca70b9bcff0c-kube-api-access-x2t4x\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.122095 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.122450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.122769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2t4x\" (UniqueName: \"kubernetes.io/projected/9baa24e8-552c-425b-a494-ca70b9bcff0c-kube-api-access-x2t4x\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.126427 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.127604 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.143046 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2t4x\" (UniqueName: \"kubernetes.io/projected/9baa24e8-552c-425b-a494-ca70b9bcff0c-kube-api-access-x2t4x\") pod \"nova-cell0-conductor-0\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.303157 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:52 crc kubenswrapper[4841]: W0130 05:27:52.803488 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9baa24e8_552c_425b_a494_ca70b9bcff0c.slice/crio-370b2e574720f045233c9ef080ea7db818028730dc30c9be26a139d9ef9a22b0 WatchSource:0}: Error finding container 370b2e574720f045233c9ef080ea7db818028730dc30c9be26a139d9ef9a22b0: Status 404 returned error can't find the container with id 370b2e574720f045233c9ef080ea7db818028730dc30c9be26a139d9ef9a22b0
Jan 30 05:27:52 crc kubenswrapper[4841]: I0130 05:27:52.808552 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 05:27:53 crc kubenswrapper[4841]: I0130 05:27:53.816893 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9baa24e8-552c-425b-a494-ca70b9bcff0c","Type":"ContainerStarted","Data":"93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1"}
Jan 30 05:27:53 crc kubenswrapper[4841]: I0130 05:27:53.817188 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 30 05:27:53 crc kubenswrapper[4841]: I0130 05:27:53.817203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9baa24e8-552c-425b-a494-ca70b9bcff0c","Type":"ContainerStarted","Data":"370b2e574720f045233c9ef080ea7db818028730dc30c9be26a139d9ef9a22b0"}
Jan 30 05:27:53 crc kubenswrapper[4841]: I0130 05:27:53.849734 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.849708176 podStartE2EDuration="2.849708176s" podCreationTimestamp="2026-01-30 05:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:27:53.837456555 +0000 UTC m=+1210.830929203" watchObservedRunningTime="2026-01-30 05:27:53.849708176 +0000 UTC m=+1210.843180844"
Jan 30 05:28:01 crc kubenswrapper[4841]: I0130 05:28:01.938996 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 05:28:02 crc kubenswrapper[4841]: I0130 05:28:02.346548 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Jan 30 05:28:02 crc kubenswrapper[4841]: I0130 05:28:02.898095 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-wnzmn"]
Jan 30 05:28:02 crc kubenswrapper[4841]: I0130 05:28:02.899292 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:02 crc kubenswrapper[4841]: I0130 05:28:02.901866 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Jan 30 05:28:02 crc kubenswrapper[4841]: I0130 05:28:02.901890 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Jan 30 05:28:02 crc kubenswrapper[4841]: I0130 05:28:02.922765 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wnzmn"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.087950 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.088260 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-scripts\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.088291 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/654366d0-8b60-4ae2-bde2-981cdc9464a4-kube-api-access-79c4h\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.088496 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-config-data\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.088836 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.117392 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.140852 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.143143 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.177085 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.178960 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.190963 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.193621 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-scripts\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.193739 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/654366d0-8b60-4ae2-bde2-981cdc9464a4-kube-api-access-79c4h\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.193895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-config-data\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.193999 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.219037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-scripts\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.224214 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.225832 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.231573 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-config-data\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.234072 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.235590 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.239315 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.239937 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/654366d0-8b60-4ae2-bde2-981cdc9464a4-kube-api-access-79c4h\") pod \"nova-cell0-cell-mapping-wnzmn\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.265135 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wnzmn"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.273585 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.295725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-config-data\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.295790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.295818 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-logs\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.295862 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-config-data\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.295884 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfsww\" (UniqueName: \"kubernetes.io/projected/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-kube-api-access-nfsww\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.295909 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.295981 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rh2s\" (UniqueName: \"kubernetes.io/projected/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-kube-api-access-8rh2s\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.318370 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.325746 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.334688 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.345488 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.378867 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-8nh7c"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.380314 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397095 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sskx\" (UniqueName: \"kubernetes.io/projected/87439d95-8eff-46d0-82a9-011d6f396f56-kube-api-access-8sskx\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397141 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87439d95-8eff-46d0-82a9-011d6f396f56-logs\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397187 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-logs\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397230 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-config-data\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397244 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-config-data\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397267 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfsww\" (UniqueName: \"kubernetes.io/projected/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-kube-api-access-nfsww\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397350 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rh2s\" (UniqueName: \"kubernetes.io/projected/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-kube-api-access-8rh2s\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397373 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.397393 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-config-data\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.401284 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-config-data\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.401704 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-8nh7c"]
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.401850 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-logs\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.406919 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.419987 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-config-data\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.420156 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfsww\" (UniqueName: \"kubernetes.io/projected/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-kube-api-access-nfsww\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.420164 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.428231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rh2s\" (UniqueName: \"kubernetes.io/projected/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-kube-api-access-8rh2s\") pod \"nova-api-0\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " pod="openstack/nova-api-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.456259 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499121 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhpc6\" (UniqueName: \"kubernetes.io/projected/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-kube-api-access-vhpc6\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499180 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v7sb\" (UniqueName: \"kubernetes.io/projected/cfc8b4aa-d421-4c89-bc56-538a727a638f-kube-api-access-8v7sb\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499249 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499271 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499291 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499322 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sskx\" (UniqueName: \"kubernetes.io/projected/87439d95-8eff-46d0-82a9-011d6f396f56-kube-api-access-8sskx\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0"
Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499345 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/87439d95-8eff-46d0-82a9-011d6f396f56-logs\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499373 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499418 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499447 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-config-data\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499463 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-config\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.499482 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-sb\") pod 
\"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.500232 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.500733 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87439d95-8eff-46d0-82a9-011d6f396f56-logs\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.513081 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.513163 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-config-data\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.531524 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sskx\" (UniqueName: \"kubernetes.io/projected/87439d95-8eff-46d0-82a9-011d6f396f56-kube-api-access-8sskx\") pod \"nova-metadata-0\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " pod="openstack/nova-metadata-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.601156 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" 
(UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.601343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhpc6\" (UniqueName: \"kubernetes.io/projected/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-kube-api-access-vhpc6\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.601587 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v7sb\" (UniqueName: \"kubernetes.io/projected/cfc8b4aa-d421-4c89-bc56-538a727a638f-kube-api-access-8v7sb\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.601688 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.601821 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.601888 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: 
\"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.602166 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.602239 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.602315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-config\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.602728 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.603588 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 
05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.604470 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.606822 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.606897 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-config\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.609966 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.611113 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.633397 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vhpc6\" (UniqueName: \"kubernetes.io/projected/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-kube-api-access-vhpc6\") pod \"dnsmasq-dns-647df7b8c5-8nh7c\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.635905 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v7sb\" (UniqueName: \"kubernetes.io/projected/cfc8b4aa-d421-4c89-bc56-538a727a638f-kube-api-access-8v7sb\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.682184 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.704829 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.782846 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.873770 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wnzmn"] Jan 30 05:28:03 crc kubenswrapper[4841]: I0130 05:28:03.987347 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.104139 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fjjz6"] Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.105478 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.109388 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.109607 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.119594 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-scripts\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.119648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2gp9\" (UniqueName: \"kubernetes.io/projected/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-kube-api-access-s2gp9\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.119760 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-config-data\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.119781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fjjz6\" 
(UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.122215 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fjjz6"] Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.174588 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.223521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-config-data\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.223561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.223626 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-scripts\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.223654 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2gp9\" (UniqueName: \"kubernetes.io/projected/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-kube-api-access-s2gp9\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " 
pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.229462 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-config-data\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.232546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-scripts\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.239987 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.240521 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2gp9\" (UniqueName: \"kubernetes.io/projected/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-kube-api-access-s2gp9\") pod \"nova-cell1-conductor-db-sync-fjjz6\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.391781 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.428447 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.485199 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.587655 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-8nh7c"] Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.922811 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fjjz6"] Jan 30 05:28:04 crc kubenswrapper[4841]: W0130 05:28:04.931465 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1faa8a3a_4660_4a7a_81d5_0fd94025b1ad.slice/crio-efd422053062933fb83364a3e31f2b4a59c34cf05f07a4c69529ea97cb571a47 WatchSource:0}: Error finding container efd422053062933fb83364a3e31f2b4a59c34cf05f07a4c69529ea97cb571a47: Status 404 returned error can't find the container with id efd422053062933fb83364a3e31f2b4a59c34cf05f07a4c69529ea97cb571a47 Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.994267 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" event={"ID":"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad","Type":"ContainerStarted","Data":"efd422053062933fb83364a3e31f2b4a59c34cf05f07a4c69529ea97cb571a47"} Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.995910 4841 generic.go:334] "Generic (PLEG): container finished" podID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerID="4229641e7fecfa26b115d7c44f70e8ff0d358d403e0c398cd81f68758b65a24d" exitCode=0 Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.995990 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" event={"ID":"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4","Type":"ContainerDied","Data":"4229641e7fecfa26b115d7c44f70e8ff0d358d403e0c398cd81f68758b65a24d"} 
Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.996010 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" event={"ID":"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4","Type":"ContainerStarted","Data":"889e9b28afb1cf6ce1fe2f45668b36937c0c648cec467195a35ca0da45014d84"} Jan 30 05:28:04 crc kubenswrapper[4841]: I0130 05:28:04.997514 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87439d95-8eff-46d0-82a9-011d6f396f56","Type":"ContainerStarted","Data":"c7b7f8a5bef7100d06d6185d6cce9bb9b52f9eb211c1dbba9eaad076dd5ec720"} Jan 30 05:28:05 crc kubenswrapper[4841]: I0130 05:28:05.000013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wnzmn" event={"ID":"654366d0-8b60-4ae2-bde2-981cdc9464a4","Type":"ContainerStarted","Data":"38df243ff6c49b1e54302c7946a385effd3bf2fb3d74c73e327e4d81515afc49"} Jan 30 05:28:05 crc kubenswrapper[4841]: I0130 05:28:05.000091 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wnzmn" event={"ID":"654366d0-8b60-4ae2-bde2-981cdc9464a4","Type":"ContainerStarted","Data":"68c8ca8a5121508f0687ae36d1ca733f3e9d6954b7a9294a6c7499f1bfe310d7"} Jan 30 05:28:05 crc kubenswrapper[4841]: I0130 05:28:05.007870 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce","Type":"ContainerStarted","Data":"75437907e450e72f4c05d632d90f100963f7d0e709318caf86ef8c880f92660f"} Jan 30 05:28:05 crc kubenswrapper[4841]: I0130 05:28:05.014794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfc8b4aa-d421-4c89-bc56-538a727a638f","Type":"ContainerStarted","Data":"9c7202904654ab26853a1d262a7b9bac3106973495fbcf3c4e11c8946cc10a64"} Jan 30 05:28:05 crc kubenswrapper[4841]: I0130 05:28:05.022531 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a","Type":"ContainerStarted","Data":"85de14798496a6b6aa90debf2c33c5265ab9a78e23c298587ffc6df2ac3188f2"} Jan 30 05:28:05 crc kubenswrapper[4841]: I0130 05:28:05.038710 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wnzmn" podStartSLOduration=3.038681123 podStartE2EDuration="3.038681123s" podCreationTimestamp="2026-01-30 05:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:05.028095347 +0000 UTC m=+1222.021567985" watchObservedRunningTime="2026-01-30 05:28:05.038681123 +0000 UTC m=+1222.032153751" Jan 30 05:28:06 crc kubenswrapper[4841]: I0130 05:28:06.049698 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" event={"ID":"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad","Type":"ContainerStarted","Data":"23495fc7312790940a345e75d7fefde2b7f84aea9ac21773cd38a0282daaaf9f"} Jan 30 05:28:06 crc kubenswrapper[4841]: I0130 05:28:06.061743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" event={"ID":"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4","Type":"ContainerStarted","Data":"eb30ed9d206df00865353e2b78d42fe6279ddd3397ab2ec90f8a6d55b5a27596"} Jan 30 05:28:06 crc kubenswrapper[4841]: I0130 05:28:06.061785 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:06 crc kubenswrapper[4841]: I0130 05:28:06.067952 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" podStartSLOduration=2.067937454 podStartE2EDuration="2.067937454s" podCreationTimestamp="2026-01-30 05:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-30 05:28:06.063657628 +0000 UTC m=+1223.057130266" watchObservedRunningTime="2026-01-30 05:28:06.067937454 +0000 UTC m=+1223.061410092" Jan 30 05:28:06 crc kubenswrapper[4841]: I0130 05:28:06.089075 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" podStartSLOduration=3.089059387 podStartE2EDuration="3.089059387s" podCreationTimestamp="2026-01-30 05:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:06.085706607 +0000 UTC m=+1223.079179245" watchObservedRunningTime="2026-01-30 05:28:06.089059387 +0000 UTC m=+1223.082532025" Jan 30 05:28:06 crc kubenswrapper[4841]: I0130 05:28:06.918140 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:06 crc kubenswrapper[4841]: I0130 05:28:06.924032 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.087031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfc8b4aa-d421-4c89-bc56-538a727a638f","Type":"ContainerStarted","Data":"dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2"} Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.087072 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cfc8b4aa-d421-4c89-bc56-538a727a638f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2" gracePeriod=30 Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.094821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a","Type":"ContainerStarted","Data":"c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d"} Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.105377 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87439d95-8eff-46d0-82a9-011d6f396f56","Type":"ContainerStarted","Data":"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57"} Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.108968 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.053693781 podStartE2EDuration="5.108950392s" podCreationTimestamp="2026-01-30 05:28:03 +0000 UTC" firstStartedPulling="2026-01-30 05:28:04.492900542 +0000 UTC m=+1221.486373180" lastFinishedPulling="2026-01-30 05:28:07.548157153 +0000 UTC m=+1224.541629791" observedRunningTime="2026-01-30 05:28:08.105030675 +0000 UTC m=+1225.098503333" watchObservedRunningTime="2026-01-30 05:28:08.108950392 +0000 UTC m=+1225.102423030" Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.109244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce","Type":"ContainerStarted","Data":"6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1"} Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.129518 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.608809288 podStartE2EDuration="5.12950045s" podCreationTimestamp="2026-01-30 05:28:03 +0000 UTC" firstStartedPulling="2026-01-30 05:28:04.025086727 +0000 UTC m=+1221.018559365" lastFinishedPulling="2026-01-30 05:28:07.545777899 +0000 UTC m=+1224.539250527" observedRunningTime="2026-01-30 05:28:08.122747836 +0000 UTC m=+1225.116220464" watchObservedRunningTime="2026-01-30 05:28:08.12950045 +0000 UTC m=+1225.122973088" Jan 30 05:28:08 
crc kubenswrapper[4841]: I0130 05:28:08.147496 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7812678480000002 podStartE2EDuration="5.147472037s" podCreationTimestamp="2026-01-30 05:28:03 +0000 UTC" firstStartedPulling="2026-01-30 05:28:04.188020579 +0000 UTC m=+1221.181493217" lastFinishedPulling="2026-01-30 05:28:07.554224768 +0000 UTC m=+1224.547697406" observedRunningTime="2026-01-30 05:28:08.138383171 +0000 UTC m=+1225.131855809" watchObservedRunningTime="2026-01-30 05:28:08.147472037 +0000 UTC m=+1225.140944685" Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.457100 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 05:28:08 crc kubenswrapper[4841]: I0130 05:28:08.705844 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.124329 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87439d95-8eff-46d0-82a9-011d6f396f56","Type":"ContainerStarted","Data":"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557"} Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.124436 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" containerName="nova-metadata-log" containerID="cri-o://da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57" gracePeriod=30 Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.124465 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" containerName="nova-metadata-metadata" containerID="cri-o://84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557" gracePeriod=30 Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 
05:28:09.134734 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce","Type":"ContainerStarted","Data":"072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2"} Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.161733 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.01211408 podStartE2EDuration="6.161711881s" podCreationTimestamp="2026-01-30 05:28:03 +0000 UTC" firstStartedPulling="2026-01-30 05:28:04.398508971 +0000 UTC m=+1221.391981609" lastFinishedPulling="2026-01-30 05:28:07.548106772 +0000 UTC m=+1224.541579410" observedRunningTime="2026-01-30 05:28:09.157376433 +0000 UTC m=+1226.150849111" watchObservedRunningTime="2026-01-30 05:28:09.161711881 +0000 UTC m=+1226.155184529" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.722017 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.828411 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-config-data\") pod \"87439d95-8eff-46d0-82a9-011d6f396f56\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.828559 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87439d95-8eff-46d0-82a9-011d6f396f56-logs\") pod \"87439d95-8eff-46d0-82a9-011d6f396f56\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.828585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sskx\" (UniqueName: \"kubernetes.io/projected/87439d95-8eff-46d0-82a9-011d6f396f56-kube-api-access-8sskx\") pod 
\"87439d95-8eff-46d0-82a9-011d6f396f56\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.828615 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-combined-ca-bundle\") pod \"87439d95-8eff-46d0-82a9-011d6f396f56\" (UID: \"87439d95-8eff-46d0-82a9-011d6f396f56\") " Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.829950 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87439d95-8eff-46d0-82a9-011d6f396f56-logs" (OuterVolumeSpecName: "logs") pod "87439d95-8eff-46d0-82a9-011d6f396f56" (UID: "87439d95-8eff-46d0-82a9-011d6f396f56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.841274 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87439d95-8eff-46d0-82a9-011d6f396f56-kube-api-access-8sskx" (OuterVolumeSpecName: "kube-api-access-8sskx") pod "87439d95-8eff-46d0-82a9-011d6f396f56" (UID: "87439d95-8eff-46d0-82a9-011d6f396f56"). InnerVolumeSpecName "kube-api-access-8sskx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.870622 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-config-data" (OuterVolumeSpecName: "config-data") pod "87439d95-8eff-46d0-82a9-011d6f396f56" (UID: "87439d95-8eff-46d0-82a9-011d6f396f56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.885426 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.885635 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f" containerName="kube-state-metrics" containerID="cri-o://5bd35b53f796dccea4917358db368bf48a05386eb29d163817739bfea5cb9461" gracePeriod=30 Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.887071 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87439d95-8eff-46d0-82a9-011d6f396f56" (UID: "87439d95-8eff-46d0-82a9-011d6f396f56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.930850 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87439d95-8eff-46d0-82a9-011d6f396f56-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.931083 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sskx\" (UniqueName: \"kubernetes.io/projected/87439d95-8eff-46d0-82a9-011d6f396f56-kube-api-access-8sskx\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.931095 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:09 crc kubenswrapper[4841]: I0130 05:28:09.931104 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/87439d95-8eff-46d0-82a9-011d6f396f56-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.161355 4841 generic.go:334] "Generic (PLEG): container finished" podID="e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f" containerID="5bd35b53f796dccea4917358db368bf48a05386eb29d163817739bfea5cb9461" exitCode=2 Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.161478 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f","Type":"ContainerDied","Data":"5bd35b53f796dccea4917358db368bf48a05386eb29d163817739bfea5cb9461"} Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.164808 4841 generic.go:334] "Generic (PLEG): container finished" podID="87439d95-8eff-46d0-82a9-011d6f396f56" containerID="84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557" exitCode=0 Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.164855 4841 generic.go:334] "Generic (PLEG): container finished" podID="87439d95-8eff-46d0-82a9-011d6f396f56" containerID="da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57" exitCode=143 Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.165955 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.167502 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87439d95-8eff-46d0-82a9-011d6f396f56","Type":"ContainerDied","Data":"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557"} Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.167536 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87439d95-8eff-46d0-82a9-011d6f396f56","Type":"ContainerDied","Data":"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57"} Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.167548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87439d95-8eff-46d0-82a9-011d6f396f56","Type":"ContainerDied","Data":"c7b7f8a5bef7100d06d6185d6cce9bb9b52f9eb211c1dbba9eaad076dd5ec720"} Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.167563 4841 scope.go:117] "RemoveContainer" containerID="84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.208539 4841 scope.go:117] "RemoveContainer" containerID="da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.222132 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.247644 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.254649 4841 scope.go:117] "RemoveContainer" containerID="84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557" Jan 30 05:28:10 crc kubenswrapper[4841]: E0130 05:28:10.255107 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557\": container with ID starting with 84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557 not found: ID does not exist" containerID="84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.255150 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557"} err="failed to get container status \"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557\": rpc error: code = NotFound desc = could not find container \"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557\": container with ID starting with 84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557 not found: ID does not exist" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.255177 4841 scope.go:117] "RemoveContainer" containerID="da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57" Jan 30 05:28:10 crc kubenswrapper[4841]: E0130 05:28:10.256350 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57\": container with ID starting with da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57 not found: ID does not exist" containerID="da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.256380 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57"} err="failed to get container status \"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57\": rpc error: code = NotFound desc = could not find container \"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57\": container with ID 
starting with da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57 not found: ID does not exist" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.256418 4841 scope.go:117] "RemoveContainer" containerID="84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.256628 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557"} err="failed to get container status \"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557\": rpc error: code = NotFound desc = could not find container \"84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557\": container with ID starting with 84189e5dd02f22d6a0c4c1a465fbccd2b0cda8eff94b62f63b700da2bf58f557 not found: ID does not exist" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.256653 4841 scope.go:117] "RemoveContainer" containerID="da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.260497 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:10 crc kubenswrapper[4841]: E0130 05:28:10.260887 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" containerName="nova-metadata-log" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.260904 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" containerName="nova-metadata-log" Jan 30 05:28:10 crc kubenswrapper[4841]: E0130 05:28:10.260927 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" containerName="nova-metadata-metadata" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.260934 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" 
containerName="nova-metadata-metadata" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.261104 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" containerName="nova-metadata-metadata" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.261120 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" containerName="nova-metadata-log" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.261720 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57"} err="failed to get container status \"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57\": rpc error: code = NotFound desc = could not find container \"da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57\": container with ID starting with da9a3df5b29a2b6243b798e58c5c8e6cc501cf719d5ac841382c4649c4f80e57 not found: ID does not exist" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.262070 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.264342 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.264425 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.267207 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.329842 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.346630 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h4zg\" (UniqueName: \"kubernetes.io/projected/e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f-kube-api-access-2h4zg\") pod \"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f\" (UID: \"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f\") " Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.346886 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.346921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456013ca-403f-4f94-914c-9111ffa0a678-logs\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.346947 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-config-data\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.346999 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmkh\" (UniqueName: \"kubernetes.io/projected/456013ca-403f-4f94-914c-9111ffa0a678-kube-api-access-sfmkh\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: 
I0130 05:28:10.347049 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.352159 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f-kube-api-access-2h4zg" (OuterVolumeSpecName: "kube-api-access-2h4zg") pod "e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f" (UID: "e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f"). InnerVolumeSpecName "kube-api-access-2h4zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.441522 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87439d95-8eff-46d0-82a9-011d6f396f56" path="/var/lib/kubelet/pods/87439d95-8eff-46d0-82a9-011d6f396f56/volumes" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.448511 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmkh\" (UniqueName: \"kubernetes.io/projected/456013ca-403f-4f94-914c-9111ffa0a678-kube-api-access-sfmkh\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.448810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.448876 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.448925 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456013ca-403f-4f94-914c-9111ffa0a678-logs\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.448953 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-config-data\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.449022 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h4zg\" (UniqueName: \"kubernetes.io/projected/e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f-kube-api-access-2h4zg\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.449845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456013ca-403f-4f94-914c-9111ffa0a678-logs\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.452141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.452971 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.453693 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-config-data\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.469575 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfmkh\" (UniqueName: \"kubernetes.io/projected/456013ca-403f-4f94-914c-9111ffa0a678-kube-api-access-sfmkh\") pod \"nova-metadata-0\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") " pod="openstack/nova-metadata-0" Jan 30 05:28:10 crc kubenswrapper[4841]: I0130 05:28:10.604058 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.094505 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:11 crc kubenswrapper[4841]: W0130 05:28:11.098598 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod456013ca_403f_4f94_914c_9111ffa0a678.slice/crio-77155c2dcba7a50755f35509119298b8ec170565c10797ac18845a947e07b6d4 WatchSource:0}: Error finding container 77155c2dcba7a50755f35509119298b8ec170565c10797ac18845a947e07b6d4: Status 404 returned error can't find the container with id 77155c2dcba7a50755f35509119298b8ec170565c10797ac18845a947e07b6d4 Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.211632 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.211687 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f","Type":"ContainerDied","Data":"692f951e0050bdc3d6c9526d02e8fbdbfe850520ea37731736a809e0bc15b7eb"} Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.211735 4841 scope.go:117] "RemoveContainer" containerID="5bd35b53f796dccea4917358db368bf48a05386eb29d163817739bfea5cb9461" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.216008 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456013ca-403f-4f94-914c-9111ffa0a678","Type":"ContainerStarted","Data":"77155c2dcba7a50755f35509119298b8ec170565c10797ac18845a947e07b6d4"} Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.340352 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.357097 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.372441 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:28:11 crc kubenswrapper[4841]: E0130 05:28:11.372909 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f" containerName="kube-state-metrics" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.372929 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f" containerName="kube-state-metrics" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.373131 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f" containerName="kube-state-metrics" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.373761 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.377129 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.377298 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.384243 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.471042 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtnz\" (UniqueName: \"kubernetes.io/projected/ef067657-4804-406e-b45f-e19553dcd2d8-kube-api-access-pmtnz\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.471093 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.471116 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.471159 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.573425 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtnz\" (UniqueName: \"kubernetes.io/projected/ef067657-4804-406e-b45f-e19553dcd2d8-kube-api-access-pmtnz\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.573688 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.573734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.573772 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.577847 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.578258 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.579999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.588542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtnz\" (UniqueName: \"kubernetes.io/projected/ef067657-4804-406e-b45f-e19553dcd2d8-kube-api-access-pmtnz\") pod \"kube-state-metrics-0\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.691003 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.770088 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.770353 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-central-agent" containerID="cri-o://24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672" gracePeriod=30 Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.770574 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="proxy-httpd" containerID="cri-o://195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933" gracePeriod=30 Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.770650 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="sg-core" containerID="cri-o://6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953" gracePeriod=30 Jan 30 05:28:11 crc kubenswrapper[4841]: I0130 05:28:11.770665 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-notification-agent" containerID="cri-o://2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7" gracePeriod=30 Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.201951 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.231199 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"456013ca-403f-4f94-914c-9111ffa0a678","Type":"ContainerStarted","Data":"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194"} Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.231244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456013ca-403f-4f94-914c-9111ffa0a678","Type":"ContainerStarted","Data":"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288"} Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.234611 4841 generic.go:334] "Generic (PLEG): container finished" podID="1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" containerID="23495fc7312790940a345e75d7fefde2b7f84aea9ac21773cd38a0282daaaf9f" exitCode=0 Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.234714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" event={"ID":"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad","Type":"ContainerDied","Data":"23495fc7312790940a345e75d7fefde2b7f84aea9ac21773cd38a0282daaaf9f"} Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.241108 4841 generic.go:334] "Generic (PLEG): container finished" podID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerID="195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933" exitCode=0 Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.241142 4841 generic.go:334] "Generic (PLEG): container finished" podID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerID="6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953" exitCode=2 Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.241151 4841 generic.go:334] "Generic (PLEG): container finished" podID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerID="24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672" exitCode=0 Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.241197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerDied","Data":"195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933"} Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.241226 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerDied","Data":"6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953"} Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.241239 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerDied","Data":"24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672"} Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.243216 4841 generic.go:334] "Generic (PLEG): container finished" podID="654366d0-8b60-4ae2-bde2-981cdc9464a4" containerID="38df243ff6c49b1e54302c7946a385effd3bf2fb3d74c73e327e4d81515afc49" exitCode=0 Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.243249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wnzmn" event={"ID":"654366d0-8b60-4ae2-bde2-981cdc9464a4","Type":"ContainerDied","Data":"38df243ff6c49b1e54302c7946a385effd3bf2fb3d74c73e327e4d81515afc49"} Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.265238 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.265212962 podStartE2EDuration="2.265212962s" podCreationTimestamp="2026-01-30 05:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:12.255695056 +0000 UTC m=+1229.249167704" watchObservedRunningTime="2026-01-30 05:28:12.265212962 +0000 UTC m=+1229.258685620" Jan 30 05:28:12 crc kubenswrapper[4841]: I0130 05:28:12.442295 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f" path="/var/lib/kubelet/pods/e7a3ee64-f1d6-4f70-90ac-ac7550a66d6f/volumes" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.258485 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef067657-4804-406e-b45f-e19553dcd2d8","Type":"ContainerStarted","Data":"c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad"} Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.258808 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef067657-4804-406e-b45f-e19553dcd2d8","Type":"ContainerStarted","Data":"b9f7b2bd9deaf526075ef3854fe26c25fc2dbb657b638d758de565f569a85f74"} Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.294169 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.907350144 podStartE2EDuration="2.294141266s" podCreationTimestamp="2026-01-30 05:28:11 +0000 UTC" firstStartedPulling="2026-01-30 05:28:12.225730728 +0000 UTC m=+1229.219203376" lastFinishedPulling="2026-01-30 05:28:12.61252181 +0000 UTC m=+1229.605994498" observedRunningTime="2026-01-30 05:28:13.285834192 +0000 UTC m=+1230.279306860" watchObservedRunningTime="2026-01-30 05:28:13.294141266 +0000 UTC m=+1230.287613934" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.456926 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.493047 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.501224 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.501278 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.760301 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.789630 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.882009 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-c2tx2"] Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.882274 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" podUID="ccfedfd0-1320-452d-b99e-5941a9601014" containerName="dnsmasq-dns" containerID="cri-o://d070949f85ad9838c18c65ae2129ba936758deaf2d70aa746df18e3aa5a1e644" gracePeriod=10 Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.888447 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wnzmn" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.939190 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2gp9\" (UniqueName: \"kubernetes.io/projected/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-kube-api-access-s2gp9\") pod \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.939323 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-config-data\") pod \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.939365 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-combined-ca-bundle\") pod \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.939436 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-scripts\") pod \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\" (UID: \"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad\") " Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.947549 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-kube-api-access-s2gp9" (OuterVolumeSpecName: "kube-api-access-s2gp9") pod "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" (UID: "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad"). InnerVolumeSpecName "kube-api-access-s2gp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.974481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" (UID: "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:13 crc kubenswrapper[4841]: I0130 05:28:13.985698 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-scripts" (OuterVolumeSpecName: "scripts") pod "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" (UID: "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.000817 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-config-data" (OuterVolumeSpecName: "config-data") pod "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" (UID: "1faa8a3a-4660-4a7a-81d5-0fd94025b1ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.018015 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.041778 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-scripts\") pod \"654366d0-8b60-4ae2-bde2-981cdc9464a4\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.041986 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-config-data\") pod \"654366d0-8b60-4ae2-bde2-981cdc9464a4\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.042039 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-combined-ca-bundle\") pod \"654366d0-8b60-4ae2-bde2-981cdc9464a4\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.042070 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/654366d0-8b60-4ae2-bde2-981cdc9464a4-kube-api-access-79c4h\") pod \"654366d0-8b60-4ae2-bde2-981cdc9464a4\" (UID: \"654366d0-8b60-4ae2-bde2-981cdc9464a4\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.042478 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2gp9\" (UniqueName: \"kubernetes.io/projected/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-kube-api-access-s2gp9\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.042496 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-config-data\") on node \"crc\" DevicePath 
\"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.042504 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.042514 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.045707 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654366d0-8b60-4ae2-bde2-981cdc9464a4-kube-api-access-79c4h" (OuterVolumeSpecName: "kube-api-access-79c4h") pod "654366d0-8b60-4ae2-bde2-981cdc9464a4" (UID: "654366d0-8b60-4ae2-bde2-981cdc9464a4"). InnerVolumeSpecName "kube-api-access-79c4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.056790 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-scripts" (OuterVolumeSpecName: "scripts") pod "654366d0-8b60-4ae2-bde2-981cdc9464a4" (UID: "654366d0-8b60-4ae2-bde2-981cdc9464a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.092667 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "654366d0-8b60-4ae2-bde2-981cdc9464a4" (UID: "654366d0-8b60-4ae2-bde2-981cdc9464a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.095921 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-config-data" (OuterVolumeSpecName: "config-data") pod "654366d0-8b60-4ae2-bde2-981cdc9464a4" (UID: "654366d0-8b60-4ae2-bde2-981cdc9464a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.143560 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-sg-core-conf-yaml\") pod \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.143632 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-config-data\") pod \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.143671 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks5sb\" (UniqueName: \"kubernetes.io/projected/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-kube-api-access-ks5sb\") pod \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.143698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-combined-ca-bundle\") pod \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.143734 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-scripts\") pod \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.143779 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-log-httpd\") pod \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.143843 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-run-httpd\") pod \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\" (UID: \"7dd22c20-b602-4e60-a34f-dc2e2ce718e7\") " Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.144233 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.144249 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.144259 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/654366d0-8b60-4ae2-bde2-981cdc9464a4-kube-api-access-79c4h\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.144269 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654366d0-8b60-4ae2-bde2-981cdc9464a4-scripts\") on node 
\"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.144584 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7dd22c20-b602-4e60-a34f-dc2e2ce718e7" (UID: "7dd22c20-b602-4e60-a34f-dc2e2ce718e7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.144845 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7dd22c20-b602-4e60-a34f-dc2e2ce718e7" (UID: "7dd22c20-b602-4e60-a34f-dc2e2ce718e7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.148523 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-scripts" (OuterVolumeSpecName: "scripts") pod "7dd22c20-b602-4e60-a34f-dc2e2ce718e7" (UID: "7dd22c20-b602-4e60-a34f-dc2e2ce718e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.150513 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-kube-api-access-ks5sb" (OuterVolumeSpecName: "kube-api-access-ks5sb") pod "7dd22c20-b602-4e60-a34f-dc2e2ce718e7" (UID: "7dd22c20-b602-4e60-a34f-dc2e2ce718e7"). InnerVolumeSpecName "kube-api-access-ks5sb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.170562 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7dd22c20-b602-4e60-a34f-dc2e2ce718e7" (UID: "7dd22c20-b602-4e60-a34f-dc2e2ce718e7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.210664 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd22c20-b602-4e60-a34f-dc2e2ce718e7" (UID: "7dd22c20-b602-4e60-a34f-dc2e2ce718e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.246647 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.246701 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks5sb\" (UniqueName: \"kubernetes.io/projected/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-kube-api-access-ks5sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.246714 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.246724 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-scripts\") on 
node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.246837 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.246850 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.262650 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-config-data" (OuterVolumeSpecName: "config-data") pod "7dd22c20-b602-4e60-a34f-dc2e2ce718e7" (UID: "7dd22c20-b602-4e60-a34f-dc2e2ce718e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.269226 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wnzmn" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.269224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wnzmn" event={"ID":"654366d0-8b60-4ae2-bde2-981cdc9464a4","Type":"ContainerDied","Data":"68c8ca8a5121508f0687ae36d1ca733f3e9d6954b7a9294a6c7499f1bfe310d7"} Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.269410 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c8ca8a5121508f0687ae36d1ca733f3e9d6954b7a9294a6c7499f1bfe310d7" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.270584 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.270623 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fjjz6" event={"ID":"1faa8a3a-4660-4a7a-81d5-0fd94025b1ad","Type":"ContainerDied","Data":"efd422053062933fb83364a3e31f2b4a59c34cf05f07a4c69529ea97cb571a47"} Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.270647 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efd422053062933fb83364a3e31f2b4a59c34cf05f07a4c69529ea97cb571a47" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.274350 4841 generic.go:334] "Generic (PLEG): container finished" podID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerID="2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7" exitCode=0 Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.274411 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerDied","Data":"2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7"} Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.274429 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd22c20-b602-4e60-a34f-dc2e2ce718e7","Type":"ContainerDied","Data":"894b253fe02feab536a7b48c4126c9d95e0c096ee8cedae7b1b2286ed11849a9"} Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.274446 4841 scope.go:117] "RemoveContainer" containerID="195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.274562 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.288738 4841 generic.go:334] "Generic (PLEG): container finished" podID="ccfedfd0-1320-452d-b99e-5941a9601014" containerID="d070949f85ad9838c18c65ae2129ba936758deaf2d70aa746df18e3aa5a1e644" exitCode=0 Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.288821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" event={"ID":"ccfedfd0-1320-452d-b99e-5941a9601014","Type":"ContainerDied","Data":"d070949f85ad9838c18c65ae2129ba936758deaf2d70aa746df18e3aa5a1e644"} Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.289078 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.322373 4841 scope.go:117] "RemoveContainer" containerID="6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.334717 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.347990 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd22c20-b602-4e60-a34f-dc2e2ce718e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.352022 4841 scope.go:117] "RemoveContainer" containerID="2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.367654 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.375764 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385230 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.385682 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="sg-core" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385700 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="sg-core" Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.385709 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="proxy-httpd" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385715 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="proxy-httpd" Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.385733 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" containerName="nova-cell1-conductor-db-sync" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385741 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" containerName="nova-cell1-conductor-db-sync" Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.385757 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-notification-agent" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385763 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-notification-agent" Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.385789 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-central-agent" Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385796 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-central-agent"
Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.385809 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654366d0-8b60-4ae2-bde2-981cdc9464a4" containerName="nova-manage"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385815 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="654366d0-8b60-4ae2-bde2-981cdc9464a4" containerName="nova-manage"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385971 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" containerName="nova-cell1-conductor-db-sync"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385985 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-central-agent"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.385998 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="ceilometer-notification-agent"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.386009 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="654366d0-8b60-4ae2-bde2-981cdc9464a4" containerName="nova-manage"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.386017 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="sg-core"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.386030 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" containerName="proxy-httpd"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.386656 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.397838 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.400063 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.416495 4841 scope.go:117] "RemoveContainer" containerID="24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.427826 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.430124 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.430251 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.432107 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.432341 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.432481 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.446025 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd22c20-b602-4e60-a34f-dc2e2ce718e7" path="/var/lib/kubelet/pods/7dd22c20-b602-4e60-a34f-dc2e2ce718e7/volumes"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.459241 4841 scope.go:117] "RemoveContainer" containerID="195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933"
Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.459973 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933\": container with ID starting with 195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933 not found: ID does not exist" containerID="195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.460012 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933"} err="failed to get container status \"195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933\": rpc error: code = NotFound desc = could not find container \"195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933\": container with ID starting with 195fefe5b5e5e5a1113f399c663b3e93995a7f57985db1a7f9848b266e37b933 not found: ID does not exist"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.460036 4841 scope.go:117] "RemoveContainer" containerID="6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953"
Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.460551 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953\": container with ID starting with 6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953 not found: ID does not exist" containerID="6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.460579 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953"} err="failed to get container status \"6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953\": rpc error: code = NotFound desc = could not find container \"6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953\": container with ID starting with 6af497c6ee4d0b31fe0f7b80ab6a725dd20684c0156ef2b8c55585e27000b953 not found: ID does not exist"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.460601 4841 scope.go:117] "RemoveContainer" containerID="2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7"
Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.465442 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7\": container with ID starting with 2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7 not found: ID does not exist" containerID="2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.465487 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7"} err="failed to get container status \"2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7\": rpc error: code = NotFound desc = could not find container \"2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7\": container with ID starting with 2c6cf9f630311d112ca892ac56b1491313a172b7a2e212a05a32c5d1a5f34bc7 not found: ID does not exist"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.465514 4841 scope.go:117] "RemoveContainer" containerID="24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672"
Jan 30 05:28:14 crc kubenswrapper[4841]: E0130 05:28:14.470223 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672\": container with ID starting with 24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672 not found: ID does not exist" containerID="24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.470252 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672"} err="failed to get container status \"24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672\": rpc error: code = NotFound desc = could not find container \"24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672\": container with ID starting with 24561fa820f0eb772a068b205fa915d95d2a2284e25ff7e3c07b782ac788b672 not found: ID does not exist"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.506745 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.507199 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-log" containerID="cri-o://6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1" gracePeriod=30
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.507698 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-api" containerID="cri-o://072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2" gracePeriod=30
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.515269 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.515362 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.524345 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.524537 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-log" containerID="cri-o://9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288" gracePeriod=30
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.524907 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-metadata" containerID="cri-o://e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194" gracePeriod=30
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.560921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561010 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561031 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561071 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561103 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq46c\" (UniqueName: \"kubernetes.io/projected/60a29cc8-4615-40e7-a687-1852db124ba0-kube-api-access-bq46c\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561180 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-scripts\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561231 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-log-httpd\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561249 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-config-data\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561319 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-run-httpd\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561340 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.561418 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n245\" (UniqueName: \"kubernetes.io/projected/750288ad-af7e-4723-90a9-00b070e12063-kube-api-access-8n245\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.662748 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-run-httpd\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.662791 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.662852 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n245\" (UniqueName: \"kubernetes.io/projected/750288ad-af7e-4723-90a9-00b070e12063-kube-api-access-8n245\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.662910 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663241 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663279 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663256 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-run-httpd\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663356 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq46c\" (UniqueName: \"kubernetes.io/projected/60a29cc8-4615-40e7-a687-1852db124ba0-kube-api-access-bq46c\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-scripts\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-log-httpd\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.663558 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-config-data\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.666956 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-log-httpd\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.669777 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.672296 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-config-data\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.672538 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.672983 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-scripts\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.674182 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.674466 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.674554 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.677720 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n245\" (UniqueName: \"kubernetes.io/projected/750288ad-af7e-4723-90a9-00b070e12063-kube-api-access-8n245\") pod \"ceilometer-0\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.682530 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq46c\" (UniqueName: \"kubernetes.io/projected/60a29cc8-4615-40e7-a687-1852db124ba0-kube-api-access-bq46c\") pod \"nova-cell1-conductor-0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.703086 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.759298 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.879313 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:14 crc kubenswrapper[4841]: I0130 05:28:14.952050 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.074264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-svc\") pod \"ccfedfd0-1320-452d-b99e-5941a9601014\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.074429 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-swift-storage-0\") pod \"ccfedfd0-1320-452d-b99e-5941a9601014\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.074458 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-config\") pod \"ccfedfd0-1320-452d-b99e-5941a9601014\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.074556 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-sb\") pod \"ccfedfd0-1320-452d-b99e-5941a9601014\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.074629 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs75w\" (UniqueName: \"kubernetes.io/projected/ccfedfd0-1320-452d-b99e-5941a9601014-kube-api-access-bs75w\") pod \"ccfedfd0-1320-452d-b99e-5941a9601014\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.074673 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-nb\") pod \"ccfedfd0-1320-452d-b99e-5941a9601014\" (UID: \"ccfedfd0-1320-452d-b99e-5941a9601014\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.105285 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccfedfd0-1320-452d-b99e-5941a9601014-kube-api-access-bs75w" (OuterVolumeSpecName: "kube-api-access-bs75w") pod "ccfedfd0-1320-452d-b99e-5941a9601014" (UID: "ccfedfd0-1320-452d-b99e-5941a9601014"). InnerVolumeSpecName "kube-api-access-bs75w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.157114 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ccfedfd0-1320-452d-b99e-5941a9601014" (UID: "ccfedfd0-1320-452d-b99e-5941a9601014"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.176617 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs75w\" (UniqueName: \"kubernetes.io/projected/ccfedfd0-1320-452d-b99e-5941a9601014-kube-api-access-bs75w\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.176637 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.183061 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.184879 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ccfedfd0-1320-452d-b99e-5941a9601014" (UID: "ccfedfd0-1320-452d-b99e-5941a9601014"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.189212 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ccfedfd0-1320-452d-b99e-5941a9601014" (UID: "ccfedfd0-1320-452d-b99e-5941a9601014"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.192864 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-config" (OuterVolumeSpecName: "config") pod "ccfedfd0-1320-452d-b99e-5941a9601014" (UID: "ccfedfd0-1320-452d-b99e-5941a9601014"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.206835 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ccfedfd0-1320-452d-b99e-5941a9601014" (UID: "ccfedfd0-1320-452d-b99e-5941a9601014"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.277870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfmkh\" (UniqueName: \"kubernetes.io/projected/456013ca-403f-4f94-914c-9111ffa0a678-kube-api-access-sfmkh\") pod \"456013ca-403f-4f94-914c-9111ffa0a678\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.277945 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-config-data\") pod \"456013ca-403f-4f94-914c-9111ffa0a678\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.277967 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-combined-ca-bundle\") pod \"456013ca-403f-4f94-914c-9111ffa0a678\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.278034 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456013ca-403f-4f94-914c-9111ffa0a678-logs\") pod \"456013ca-403f-4f94-914c-9111ffa0a678\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.278320 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/456013ca-403f-4f94-914c-9111ffa0a678-logs" (OuterVolumeSpecName: "logs") pod "456013ca-403f-4f94-914c-9111ffa0a678" (UID: "456013ca-403f-4f94-914c-9111ffa0a678"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.278359 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-nova-metadata-tls-certs\") pod \"456013ca-403f-4f94-914c-9111ffa0a678\" (UID: \"456013ca-403f-4f94-914c-9111ffa0a678\") "
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.279124 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.279142 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456013ca-403f-4f94-914c-9111ffa0a678-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.279153 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.279162 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-config\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.279170 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccfedfd0-1320-452d-b99e-5941a9601014-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.281944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/456013ca-403f-4f94-914c-9111ffa0a678-kube-api-access-sfmkh" (OuterVolumeSpecName: "kube-api-access-sfmkh") pod "456013ca-403f-4f94-914c-9111ffa0a678" (UID: "456013ca-403f-4f94-914c-9111ffa0a678"). InnerVolumeSpecName "kube-api-access-sfmkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.300203 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2"
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.300716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-c2tx2" event={"ID":"ccfedfd0-1320-452d-b99e-5941a9601014","Type":"ContainerDied","Data":"4ad9ddcece4b7f4422f84b83fa5cb5c0af6b0640242305c0cdf1e7f49f3eded1"}
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.300769 4841 scope.go:117] "RemoveContainer" containerID="d070949f85ad9838c18c65ae2129ba936758deaf2d70aa746df18e3aa5a1e644"
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.301207 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-config-data" (OuterVolumeSpecName: "config-data") pod "456013ca-403f-4f94-914c-9111ffa0a678" (UID: "456013ca-403f-4f94-914c-9111ffa0a678"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.304768 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "456013ca-403f-4f94-914c-9111ffa0a678" (UID: "456013ca-403f-4f94-914c-9111ffa0a678"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.307319 4841 generic.go:334] "Generic (PLEG): container finished" podID="456013ca-403f-4f94-914c-9111ffa0a678" containerID="e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194" exitCode=0
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.307350 4841 generic.go:334] "Generic (PLEG): container finished" podID="456013ca-403f-4f94-914c-9111ffa0a678" containerID="9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288" exitCode=143
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.307537 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.308009 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456013ca-403f-4f94-914c-9111ffa0a678","Type":"ContainerDied","Data":"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194"}
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.308051 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456013ca-403f-4f94-914c-9111ffa0a678","Type":"ContainerDied","Data":"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288"}
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.308061 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456013ca-403f-4f94-914c-9111ffa0a678","Type":"ContainerDied","Data":"77155c2dcba7a50755f35509119298b8ec170565c10797ac18845a947e07b6d4"}
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.310310 4841 generic.go:334] "Generic (PLEG): container finished" podID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerID="6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1" exitCode=143
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.311450 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce","Type":"ContainerDied","Data":"6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1"}
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.345348 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-c2tx2"]
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.350812 4841 scope.go:117] "RemoveContainer" containerID="290b306f10c71213e52e22feccb083a2e99d29e7890b6f8bfa1024b4ea5849b5"
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.355847 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "456013ca-403f-4f94-914c-9111ffa0a678" (UID: "456013ca-403f-4f94-914c-9111ffa0a678"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.376134 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-c2tx2"]
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.380564 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfmkh\" (UniqueName: \"kubernetes.io/projected/456013ca-403f-4f94-914c-9111ffa0a678-kube-api-access-sfmkh\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.380595 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.380608 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:15
crc kubenswrapper[4841]: I0130 05:28:15.380621 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/456013ca-403f-4f94-914c-9111ffa0a678-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.381893 4841 scope.go:117] "RemoveContainer" containerID="e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.390685 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.398705 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.399565 4841 scope.go:117] "RemoveContainer" containerID="9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.424258 4841 scope.go:117] "RemoveContainer" containerID="e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194" Jan 30 05:28:15 crc kubenswrapper[4841]: E0130 05:28:15.424819 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194\": container with ID starting with e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194 not found: ID does not exist" containerID="e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.424881 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194"} err="failed to get container status \"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194\": rpc error: code = NotFound desc = could not find container 
\"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194\": container with ID starting with e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194 not found: ID does not exist" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.424908 4841 scope.go:117] "RemoveContainer" containerID="9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288" Jan 30 05:28:15 crc kubenswrapper[4841]: E0130 05:28:15.425504 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288\": container with ID starting with 9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288 not found: ID does not exist" containerID="9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.425554 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288"} err="failed to get container status \"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288\": rpc error: code = NotFound desc = could not find container \"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288\": container with ID starting with 9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288 not found: ID does not exist" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.425587 4841 scope.go:117] "RemoveContainer" containerID="e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.425933 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194"} err="failed to get container status \"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194\": rpc error: code = NotFound desc = could not find 
container \"e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194\": container with ID starting with e5c03b478a8c74e7a7700791db0c6cabee534b4db5249fb04cf2305768a9e194 not found: ID does not exist" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.425958 4841 scope.go:117] "RemoveContainer" containerID="9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.426221 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288"} err="failed to get container status \"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288\": rpc error: code = NotFound desc = could not find container \"9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288\": container with ID starting with 9ce77433adc8f481e8522c551e1b670cefb9f73d29f5fc765d4bf38ccda2b288 not found: ID does not exist" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.652835 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.660336 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.682543 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:15 crc kubenswrapper[4841]: E0130 05:28:15.682953 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-log" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.682972 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-log" Jan 30 05:28:15 crc kubenswrapper[4841]: E0130 05:28:15.683006 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ccfedfd0-1320-452d-b99e-5941a9601014" containerName="dnsmasq-dns" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.683013 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfedfd0-1320-452d-b99e-5941a9601014" containerName="dnsmasq-dns" Jan 30 05:28:15 crc kubenswrapper[4841]: E0130 05:28:15.683026 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-metadata" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.683034 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-metadata" Jan 30 05:28:15 crc kubenswrapper[4841]: E0130 05:28:15.683056 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfedfd0-1320-452d-b99e-5941a9601014" containerName="init" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.683065 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfedfd0-1320-452d-b99e-5941a9601014" containerName="init" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.683256 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-metadata" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.683270 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccfedfd0-1320-452d-b99e-5941a9601014" containerName="dnsmasq-dns" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.683281 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="456013ca-403f-4f94-914c-9111ffa0a678" containerName="nova-metadata-log" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.689593 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.691466 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.691754 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.713994 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.787417 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfqj2\" (UniqueName: \"kubernetes.io/projected/585eb87e-f9af-4f01-9386-b28940a3d039-kube-api-access-qfqj2\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.787725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-config-data\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.787794 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.787833 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb87e-f9af-4f01-9386-b28940a3d039-logs\") pod \"nova-metadata-0\" 
(UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.787849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.889968 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfqj2\" (UniqueName: \"kubernetes.io/projected/585eb87e-f9af-4f01-9386-b28940a3d039-kube-api-access-qfqj2\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.890781 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-config-data\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.891533 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.891600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb87e-f9af-4f01-9386-b28940a3d039-logs\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.891651 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.894831 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-config-data\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.895088 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb87e-f9af-4f01-9386-b28940a3d039-logs\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.904981 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.905367 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:15 crc kubenswrapper[4841]: I0130 05:28:15.907884 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfqj2\" (UniqueName: \"kubernetes.io/projected/585eb87e-f9af-4f01-9386-b28940a3d039-kube-api-access-qfqj2\") pod \"nova-metadata-0\" 
(UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " pod="openstack/nova-metadata-0" Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.061236 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.320118 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60a29cc8-4615-40e7-a687-1852db124ba0","Type":"ContainerStarted","Data":"3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec"} Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.320308 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60a29cc8-4615-40e7-a687-1852db124ba0","Type":"ContainerStarted","Data":"3387db4692895bcb2dcd2388f34c0c36789dd3cf261768a3333ed24ddc7e2a34"} Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.321326 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.324544 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerStarted","Data":"24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674"} Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.324572 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerStarted","Data":"7dd4ac30727d5a905b927d152ba3e6544fc797e5b821687d84970324350c2812"} Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.328183 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" containerName="nova-scheduler-scheduler" containerID="cri-o://c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d" gracePeriod=30 
Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.342038 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.342021929 podStartE2EDuration="2.342021929s" podCreationTimestamp="2026-01-30 05:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:16.337261441 +0000 UTC m=+1233.330734089" watchObservedRunningTime="2026-01-30 05:28:16.342021929 +0000 UTC m=+1233.335494557" Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.442388 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456013ca-403f-4f94-914c-9111ffa0a678" path="/var/lib/kubelet/pods/456013ca-403f-4f94-914c-9111ffa0a678/volumes" Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.442956 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccfedfd0-1320-452d-b99e-5941a9601014" path="/var/lib/kubelet/pods/ccfedfd0-1320-452d-b99e-5941a9601014/volumes" Jan 30 05:28:16 crc kubenswrapper[4841]: I0130 05:28:16.500307 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:28:17 crc kubenswrapper[4841]: I0130 05:28:17.339561 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"585eb87e-f9af-4f01-9386-b28940a3d039","Type":"ContainerStarted","Data":"e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154"} Jan 30 05:28:17 crc kubenswrapper[4841]: I0130 05:28:17.340012 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"585eb87e-f9af-4f01-9386-b28940a3d039","Type":"ContainerStarted","Data":"29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376"} Jan 30 05:28:17 crc kubenswrapper[4841]: I0130 05:28:17.340022 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"585eb87e-f9af-4f01-9386-b28940a3d039","Type":"ContainerStarted","Data":"9db138b14c637a9907c1609cdf21ed40d06f9114b565379717f86c64bfbe9baa"} Jan 30 05:28:17 crc kubenswrapper[4841]: I0130 05:28:17.344789 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerStarted","Data":"89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5"} Jan 30 05:28:17 crc kubenswrapper[4841]: I0130 05:28:17.344815 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerStarted","Data":"9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a"} Jan 30 05:28:17 crc kubenswrapper[4841]: I0130 05:28:17.369529 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.369504424 podStartE2EDuration="2.369504424s" podCreationTimestamp="2026-01-30 05:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:17.354019267 +0000 UTC m=+1234.347491895" watchObservedRunningTime="2026-01-30 05:28:17.369504424 +0000 UTC m=+1234.362977062" Jan 30 05:28:18 crc kubenswrapper[4841]: E0130 05:28:18.459568 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:28:18 crc kubenswrapper[4841]: E0130 05:28:18.462149 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:28:18 crc kubenswrapper[4841]: E0130 05:28:18.466778 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:28:18 crc kubenswrapper[4841]: E0130 05:28:18.466814 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" containerName="nova-scheduler-scheduler" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.382308 4841 generic.go:334] "Generic (PLEG): container finished" podID="38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" containerID="c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d" exitCode=0 Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.382385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a","Type":"ContainerDied","Data":"c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d"} Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.388291 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerStarted","Data":"fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6"} Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.388480 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.412860 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.915162547 podStartE2EDuration="5.41284298s" podCreationTimestamp="2026-01-30 05:28:14 +0000 UTC" firstStartedPulling="2026-01-30 05:28:15.381914109 +0000 UTC m=+1232.375386757" lastFinishedPulling="2026-01-30 05:28:18.879594552 +0000 UTC m=+1235.873067190" observedRunningTime="2026-01-30 05:28:19.407923988 +0000 UTC m=+1236.401396636" watchObservedRunningTime="2026-01-30 05:28:19.41284298 +0000 UTC m=+1236.406315618" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.704769 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.865820 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfsww\" (UniqueName: \"kubernetes.io/projected/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-kube-api-access-nfsww\") pod \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.866126 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-combined-ca-bundle\") pod \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.866178 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-config-data\") pod \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\" (UID: \"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a\") " Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.876152 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-kube-api-access-nfsww" (OuterVolumeSpecName: "kube-api-access-nfsww") pod "38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" (UID: "38918ba4-5ba7-49ca-b6a2-6183d9e9f06a"). InnerVolumeSpecName "kube-api-access-nfsww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.897513 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-config-data" (OuterVolumeSpecName: "config-data") pod "38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" (UID: "38918ba4-5ba7-49ca-b6a2-6183d9e9f06a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.906381 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" (UID: "38918ba4-5ba7-49ca-b6a2-6183d9e9f06a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.970305 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfsww\" (UniqueName: \"kubernetes.io/projected/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-kube-api-access-nfsww\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.970345 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:19 crc kubenswrapper[4841]: I0130 05:28:19.970354 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.385660 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.400097 4841 generic.go:334] "Generic (PLEG): container finished" podID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerID="072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2" exitCode=0 Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.400190 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.400191 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce","Type":"ContainerDied","Data":"072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2"} Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.400537 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce","Type":"ContainerDied","Data":"75437907e450e72f4c05d632d90f100963f7d0e709318caf86ef8c880f92660f"} Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.400567 4841 scope.go:117] "RemoveContainer" containerID="072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.412127 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.414282 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38918ba4-5ba7-49ca-b6a2-6183d9e9f06a","Type":"ContainerDied","Data":"85de14798496a6b6aa90debf2c33c5265ab9a78e23c298587ffc6df2ac3188f2"} Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.444913 4841 scope.go:117] "RemoveContainer" containerID="6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.493100 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-config-data\") pod \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.493313 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-logs\") pod \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.493416 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rh2s\" (UniqueName: \"kubernetes.io/projected/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-kube-api-access-8rh2s\") pod \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.493464 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-combined-ca-bundle\") pod \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\" (UID: \"08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce\") " Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.497235 4841 scope.go:117] "RemoveContainer" containerID="072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2" Jan 30 05:28:20 crc kubenswrapper[4841]: E0130 05:28:20.498040 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2\": container with ID starting with 072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2 not found: ID does not exist" containerID="072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.498092 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2"} err="failed to get container status \"072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2\": rpc error: code = NotFound desc = could not find container 
\"072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2\": container with ID starting with 072e06e58c5422c4c66e43ec84b691962f829b02effa3922b26b9c5f5a0d2ea2 not found: ID does not exist" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.498135 4841 scope.go:117] "RemoveContainer" containerID="6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.498831 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: E0130 05:28:20.499032 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1\": container with ID starting with 6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1 not found: ID does not exist" containerID="6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.499807 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1"} err="failed to get container status \"6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1\": rpc error: code = NotFound desc = could not find container \"6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1\": container with ID starting with 6b15b722c362365c5bb30bea875221321bfee52343826061b39f01be13fa42e1 not found: ID does not exist" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.499935 4841 scope.go:117] "RemoveContainer" containerID="c5de62d5f742f5ea5c9d3c4d4fba4e2947cbdb9e394fc12da86e9378bb057b6d" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.502600 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-kube-api-access-8rh2s" 
(OuterVolumeSpecName: "kube-api-access-8rh2s") pod "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" (UID: "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce"). InnerVolumeSpecName "kube-api-access-8rh2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.503362 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-logs" (OuterVolumeSpecName: "logs") pod "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" (UID: "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.526594 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" (UID: "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.527102 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.528695 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-config-data" (OuterVolumeSpecName: "config-data") pod "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" (UID: "08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.535946 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: E0130 05:28:20.536330 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-api" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.536345 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-api" Jan 30 05:28:20 crc kubenswrapper[4841]: E0130 05:28:20.536389 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-log" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.536406 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-log" Jan 30 05:28:20 crc kubenswrapper[4841]: E0130 05:28:20.536425 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" containerName="nova-scheduler-scheduler" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.536431 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" containerName="nova-scheduler-scheduler" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.536604 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" containerName="nova-scheduler-scheduler" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.536614 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" containerName="nova-api-log" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.536628 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" 
containerName="nova-api-api" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.537332 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.539303 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.545366 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.596373 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.596595 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.596683 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rh2s\" (UniqueName: \"kubernetes.io/projected/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-kube-api-access-8rh2s\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.596740 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.698161 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9n72\" (UniqueName: \"kubernetes.io/projected/ade12f79-5499-4065-8888-40fc5e50bfe5-kube-api-access-s9n72\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 
crc kubenswrapper[4841]: I0130 05:28:20.698500 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-config-data\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.698679 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.741174 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.754773 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.765301 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.767030 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.776358 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.788991 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.805956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.806043 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9n72\" (UniqueName: \"kubernetes.io/projected/ade12f79-5499-4065-8888-40fc5e50bfe5-kube-api-access-s9n72\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.806109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-config-data\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.809541 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.810536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-config-data\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.825588 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9n72\" (UniqueName: \"kubernetes.io/projected/ade12f79-5499-4065-8888-40fc5e50bfe5-kube-api-access-s9n72\") pod \"nova-scheduler-0\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") " pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.854892 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.907473 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-config-data\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.907560 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a490e56-2eb9-4f78-8318-0ec31d0a545d-logs\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.907652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:20 crc kubenswrapper[4841]: I0130 05:28:20.907682 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gcgv9\" (UniqueName: \"kubernetes.io/projected/5a490e56-2eb9-4f78-8318-0ec31d0a545d-kube-api-access-gcgv9\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.009016 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-config-data\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.009273 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a490e56-2eb9-4f78-8318-0ec31d0a545d-logs\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.009319 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.009348 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgv9\" (UniqueName: \"kubernetes.io/projected/5a490e56-2eb9-4f78-8318-0ec31d0a545d-kube-api-access-gcgv9\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.009732 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a490e56-2eb9-4f78-8318-0ec31d0a545d-logs\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 
05:28:21.014811 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.028906 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-config-data\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.038704 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgv9\" (UniqueName: \"kubernetes.io/projected/5a490e56-2eb9-4f78-8318-0ec31d0a545d-kube-api-access-gcgv9\") pod \"nova-api-0\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.067585 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.067646 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.092013 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.320376 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:28:21 crc kubenswrapper[4841]: W0130 05:28:21.328613 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podade12f79_5499_4065_8888_40fc5e50bfe5.slice/crio-28bb4af9c2cfd343dc852acc94e669e3f0f92024eb81036204bc07d6b0ecdb93 WatchSource:0}: Error finding container 28bb4af9c2cfd343dc852acc94e669e3f0f92024eb81036204bc07d6b0ecdb93: Status 404 returned error can't find the container with id 28bb4af9c2cfd343dc852acc94e669e3f0f92024eb81036204bc07d6b0ecdb93 Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.430101 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ade12f79-5499-4065-8888-40fc5e50bfe5","Type":"ContainerStarted","Data":"28bb4af9c2cfd343dc852acc94e669e3f0f92024eb81036204bc07d6b0ecdb93"} Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.622583 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:21 crc kubenswrapper[4841]: W0130 05:28:21.630260 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a490e56_2eb9_4f78_8318_0ec31d0a545d.slice/crio-abac83d77c723046c5be74f2b4ce96674ebe2e75a64cc3dd5244aa16d92095b4 WatchSource:0}: Error finding container abac83d77c723046c5be74f2b4ce96674ebe2e75a64cc3dd5244aa16d92095b4: Status 404 returned error can't find the container with id abac83d77c723046c5be74f2b4ce96674ebe2e75a64cc3dd5244aa16d92095b4 Jan 30 05:28:21 crc kubenswrapper[4841]: I0130 05:28:21.706790 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.442105 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce" path="/var/lib/kubelet/pods/08a4a9cf-ae39-4cd8-b49a-70b66a4a44ce/volumes" Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.443325 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38918ba4-5ba7-49ca-b6a2-6183d9e9f06a" path="/var/lib/kubelet/pods/38918ba4-5ba7-49ca-b6a2-6183d9e9f06a/volumes" Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.443924 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ade12f79-5499-4065-8888-40fc5e50bfe5","Type":"ContainerStarted","Data":"c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193"} Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.444072 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a490e56-2eb9-4f78-8318-0ec31d0a545d","Type":"ContainerStarted","Data":"7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712"} Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.444113 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a490e56-2eb9-4f78-8318-0ec31d0a545d","Type":"ContainerStarted","Data":"8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33"} Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.444126 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a490e56-2eb9-4f78-8318-0ec31d0a545d","Type":"ContainerStarted","Data":"abac83d77c723046c5be74f2b4ce96674ebe2e75a64cc3dd5244aa16d92095b4"} Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.463668 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.463653042 podStartE2EDuration="2.463653042s" podCreationTimestamp="2026-01-30 05:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:22.45950755 +0000 UTC m=+1239.452980188" watchObservedRunningTime="2026-01-30 05:28:22.463653042 +0000 UTC m=+1239.457125680" Jan 30 05:28:22 crc kubenswrapper[4841]: I0130 05:28:22.485563 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.485547162 podStartE2EDuration="2.485547162s" podCreationTimestamp="2026-01-30 05:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:22.479356475 +0000 UTC m=+1239.472829113" watchObservedRunningTime="2026-01-30 05:28:22.485547162 +0000 UTC m=+1239.479019800" Jan 30 05:28:24 crc kubenswrapper[4841]: I0130 05:28:24.755466 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 30 05:28:25 crc kubenswrapper[4841]: I0130 05:28:25.855591 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 05:28:26 crc kubenswrapper[4841]: I0130 05:28:26.064226 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 05:28:26 crc kubenswrapper[4841]: I0130 05:28:26.064290 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 05:28:27 crc kubenswrapper[4841]: I0130 05:28:27.074832 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 05:28:27 crc kubenswrapper[4841]: I0130 05:28:27.084674 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 05:28:30 crc kubenswrapper[4841]: I0130 05:28:30.855247 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 05:28:30 crc kubenswrapper[4841]: I0130 05:28:30.901552 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 05:28:31 crc kubenswrapper[4841]: I0130 05:28:31.092479 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:28:31 crc kubenswrapper[4841]: I0130 05:28:31.092546 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 05:28:31 crc kubenswrapper[4841]: I0130 05:28:31.607068 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 05:28:32 crc kubenswrapper[4841]: I0130 05:28:32.174603 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 05:28:32 crc kubenswrapper[4841]: I0130 05:28:32.174720 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 05:28:36 crc kubenswrapper[4841]: I0130 05:28:36.082303 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 05:28:36 crc kubenswrapper[4841]: I0130 05:28:36.083998 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 05:28:36 crc kubenswrapper[4841]: I0130 05:28:36.099767 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 05:28:36 crc kubenswrapper[4841]: I0130 05:28:36.645131 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.493788 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.620339 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-combined-ca-bundle\") pod \"cfc8b4aa-d421-4c89-bc56-538a727a638f\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.620599 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-config-data\") pod \"cfc8b4aa-d421-4c89-bc56-538a727a638f\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.621269 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v7sb\" (UniqueName: \"kubernetes.io/projected/cfc8b4aa-d421-4c89-bc56-538a727a638f-kube-api-access-8v7sb\") pod \"cfc8b4aa-d421-4c89-bc56-538a727a638f\" (UID: \"cfc8b4aa-d421-4c89-bc56-538a727a638f\") " Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.627431 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc8b4aa-d421-4c89-bc56-538a727a638f-kube-api-access-8v7sb" (OuterVolumeSpecName: "kube-api-access-8v7sb") pod "cfc8b4aa-d421-4c89-bc56-538a727a638f" (UID: 
"cfc8b4aa-d421-4c89-bc56-538a727a638f"). InnerVolumeSpecName "kube-api-access-8v7sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.647506 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfc8b4aa-d421-4c89-bc56-538a727a638f" (UID: "cfc8b4aa-d421-4c89-bc56-538a727a638f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.656899 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-config-data" (OuterVolumeSpecName: "config-data") pod "cfc8b4aa-d421-4c89-bc56-538a727a638f" (UID: "cfc8b4aa-d421-4c89-bc56-538a727a638f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.664195 4841 generic.go:334] "Generic (PLEG): container finished" podID="cfc8b4aa-d421-4c89-bc56-538a727a638f" containerID="dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2" exitCode=137 Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.664963 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfc8b4aa-d421-4c89-bc56-538a727a638f","Type":"ContainerDied","Data":"dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2"} Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.665031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfc8b4aa-d421-4c89-bc56-538a727a638f","Type":"ContainerDied","Data":"9c7202904654ab26853a1d262a7b9bac3106973495fbcf3c4e11c8946cc10a64"} Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.665043 4841 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.665068 4841 scope.go:117] "RemoveContainer" containerID="dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.724083 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.724105 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc8b4aa-d421-4c89-bc56-538a727a638f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.724114 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v7sb\" (UniqueName: \"kubernetes.io/projected/cfc8b4aa-d421-4c89-bc56-538a727a638f-kube-api-access-8v7sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.745135 4841 scope.go:117] "RemoveContainer" containerID="dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2" Jan 30 05:28:38 crc kubenswrapper[4841]: E0130 05:28:38.745576 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2\": container with ID starting with dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2 not found: ID does not exist" containerID="dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.745617 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2"} err="failed to get container status 
\"dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2\": rpc error: code = NotFound desc = could not find container \"dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2\": container with ID starting with dba149389afcab6dec8b3ec792b66bd13d6595cef2685d38b8f1032b37c461f2 not found: ID does not exist" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.769433 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.776883 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.809039 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:38 crc kubenswrapper[4841]: E0130 05:28:38.809998 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc8b4aa-d421-4c89-bc56-538a727a638f" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.810021 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc8b4aa-d421-4c89-bc56-538a727a638f" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.810382 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc8b4aa-d421-4c89-bc56-538a727a638f" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.811416 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.820900 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.821309 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.829456 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.839060 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.927749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.927838 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.927876 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc 
kubenswrapper[4841]: I0130 05:28:38.927916 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:38 crc kubenswrapper[4841]: I0130 05:28:38.928020 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9kf\" (UniqueName: \"kubernetes.io/projected/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-kube-api-access-hh9kf\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.030050 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9kf\" (UniqueName: \"kubernetes.io/projected/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-kube-api-access-hh9kf\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.030222 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.030247 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: 
I0130 05:28:39.030283 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.030320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.034292 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.034642 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.035368 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.035795 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.056793 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9kf\" (UniqueName: \"kubernetes.io/projected/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-kube-api-access-hh9kf\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.137783 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:39 crc kubenswrapper[4841]: I0130 05:28:39.681199 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:28:39 crc kubenswrapper[4841]: W0130 05:28:39.691263 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fe10a9a_a21e_4b2c_a5da_d50340115c7a.slice/crio-7c5495bfedaeafce9f0e4e0f6e39d3e026f871946ae30e63e06981febebac3a9 WatchSource:0}: Error finding container 7c5495bfedaeafce9f0e4e0f6e39d3e026f871946ae30e63e06981febebac3a9: Status 404 returned error can't find the container with id 7c5495bfedaeafce9f0e4e0f6e39d3e026f871946ae30e63e06981febebac3a9 Jan 30 05:28:40 crc kubenswrapper[4841]: I0130 05:28:40.450216 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc8b4aa-d421-4c89-bc56-538a727a638f" path="/var/lib/kubelet/pods/cfc8b4aa-d421-4c89-bc56-538a727a638f/volumes" Jan 30 05:28:40 crc kubenswrapper[4841]: I0130 05:28:40.463543 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:28:40 crc kubenswrapper[4841]: I0130 05:28:40.463616 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:28:40 crc kubenswrapper[4841]: I0130 05:28:40.699973 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe10a9a-a21e-4b2c-a5da-d50340115c7a","Type":"ContainerStarted","Data":"de54af2206ef9739b9851bdbdbfe8715a9ca62af5991471c3c6388dc3e2c68b3"} Jan 30 05:28:40 crc kubenswrapper[4841]: I0130 05:28:40.700029 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe10a9a-a21e-4b2c-a5da-d50340115c7a","Type":"ContainerStarted","Data":"7c5495bfedaeafce9f0e4e0f6e39d3e026f871946ae30e63e06981febebac3a9"} Jan 30 05:28:40 crc kubenswrapper[4841]: I0130 05:28:40.727326 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.727298293 podStartE2EDuration="2.727298293s" podCreationTimestamp="2026-01-30 05:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:40.721838256 +0000 UTC m=+1257.715310934" watchObservedRunningTime="2026-01-30 05:28:40.727298293 +0000 UTC m=+1257.720770971" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.097041 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.097586 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.098498 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.102743 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.712272 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.715834 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.945057 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-zb2qt"] Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.946965 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.959815 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-zb2qt"] Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.997928 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8px8v\" (UniqueName: \"kubernetes.io/projected/0da7e312-7550-4d60-a14b-d5dbdc500e88-kube-api-access-8px8v\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.997989 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " 
pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.998330 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.998376 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.998430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:41 crc kubenswrapper[4841]: I0130 05:28:41.998481 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-config\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.100201 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") 
" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.100253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.100278 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.100313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-config\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.100377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8px8v\" (UniqueName: \"kubernetes.io/projected/0da7e312-7550-4d60-a14b-d5dbdc500e88-kube-api-access-8px8v\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.100427 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc 
kubenswrapper[4841]: I0130 05:28:42.101135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.101161 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.101276 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.101379 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-config\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.101672 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.117636 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8px8v\" (UniqueName: \"kubernetes.io/projected/0da7e312-7550-4d60-a14b-d5dbdc500e88-kube-api-access-8px8v\") pod \"dnsmasq-dns-fcd6f8f8f-zb2qt\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.268604 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:42 crc kubenswrapper[4841]: I0130 05:28:42.836109 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-zb2qt"] Jan 30 05:28:42 crc kubenswrapper[4841]: W0130 05:28:42.837254 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0da7e312_7550_4d60_a14b_d5dbdc500e88.slice/crio-9381d5c45ba94de36ed625ec31606b1cf1ad32e73382b50c5f7e64e061b5c058 WatchSource:0}: Error finding container 9381d5c45ba94de36ed625ec31606b1cf1ad32e73382b50c5f7e64e061b5c058: Status 404 returned error can't find the container with id 9381d5c45ba94de36ed625ec31606b1cf1ad32e73382b50c5f7e64e061b5c058 Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.704225 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.705044 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-central-agent" containerID="cri-o://24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674" gracePeriod=30 Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.705178 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="proxy-httpd" containerID="cri-o://fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6" gracePeriod=30 Jan 30 
05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.705234 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="sg-core" containerID="cri-o://89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5" gracePeriod=30 Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.705268 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-notification-agent" containerID="cri-o://9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a" gracePeriod=30 Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.722999 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.737389 4841 generic.go:334] "Generic (PLEG): container finished" podID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerID="a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36" exitCode=0 Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.739718 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" event={"ID":"0da7e312-7550-4d60-a14b-d5dbdc500e88","Type":"ContainerDied","Data":"a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36"} Jan 30 05:28:43 crc kubenswrapper[4841]: I0130 05:28:43.739753 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" event={"ID":"0da7e312-7550-4d60-a14b-d5dbdc500e88","Type":"ContainerStarted","Data":"9381d5c45ba94de36ed625ec31606b1cf1ad32e73382b50c5f7e64e061b5c058"} Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.137988 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.494811 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.753080 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" event={"ID":"0da7e312-7550-4d60-a14b-d5dbdc500e88","Type":"ContainerStarted","Data":"4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2"} Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.754348 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.757985 4841 generic.go:334] "Generic (PLEG): container finished" podID="750288ad-af7e-4723-90a9-00b070e12063" containerID="fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6" exitCode=0 Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.758013 4841 generic.go:334] "Generic (PLEG): container finished" podID="750288ad-af7e-4723-90a9-00b070e12063" containerID="89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5" exitCode=2 Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.758026 4841 generic.go:334] "Generic (PLEG): container finished" podID="750288ad-af7e-4723-90a9-00b070e12063" containerID="24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674" exitCode=0 Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.758172 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-log" containerID="cri-o://8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33" gracePeriod=30 Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.758380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerDied","Data":"fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6"} Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.758431 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerDied","Data":"89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5"} Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.758454 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerDied","Data":"24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674"} Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.758528 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-api" containerID="cri-o://7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712" gracePeriod=30 Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.760956 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.198:3000/\": dial tcp 10.217.0.198:3000: connect: connection refused" Jan 30 05:28:44 crc kubenswrapper[4841]: I0130 05:28:44.798670 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" podStartSLOduration=3.798644553 podStartE2EDuration="3.798644553s" podCreationTimestamp="2026-01-30 05:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:44.782909749 +0000 UTC m=+1261.776382397" watchObservedRunningTime="2026-01-30 05:28:44.798644553 +0000 UTC m=+1261.792117191" 
Jan 30 05:28:45 crc kubenswrapper[4841]: I0130 05:28:45.774173 4841 generic.go:334] "Generic (PLEG): container finished" podID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerID="8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33" exitCode=143 Jan 30 05:28:45 crc kubenswrapper[4841]: I0130 05:28:45.774237 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a490e56-2eb9-4f78-8318-0ec31d0a545d","Type":"ContainerDied","Data":"8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33"} Jan 30 05:28:47 crc kubenswrapper[4841]: E0130 05:28:47.093302 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod750288ad_af7e_4723_90a9_00b070e12063.slice/crio-9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.565164 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.699768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-combined-ca-bundle\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.699861 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-ceilometer-tls-certs\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.699897 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-config-data\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.700022 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n245\" (UniqueName: \"kubernetes.io/projected/750288ad-af7e-4723-90a9-00b070e12063-kube-api-access-8n245\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.700101 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-scripts\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.700269 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-sg-core-conf-yaml\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.700355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-run-httpd\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.700461 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-log-httpd\") pod \"750288ad-af7e-4723-90a9-00b070e12063\" (UID: \"750288ad-af7e-4723-90a9-00b070e12063\") " Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.701571 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.701839 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.710203 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750288ad-af7e-4723-90a9-00b070e12063-kube-api-access-8n245" (OuterVolumeSpecName: "kube-api-access-8n245") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "kube-api-access-8n245". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.710979 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-scripts" (OuterVolumeSpecName: "scripts") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.730347 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.804041 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.804475 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.804777 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750288ad-af7e-4723-90a9-00b070e12063-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.804807 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n245\" (UniqueName: \"kubernetes.io/projected/750288ad-af7e-4723-90a9-00b070e12063-kube-api-access-8n245\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.804829 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.804849 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.810132 4841 generic.go:334] "Generic (PLEG): container finished" podID="750288ad-af7e-4723-90a9-00b070e12063" containerID="9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a" exitCode=0 Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.810196 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerDied","Data":"9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a"} Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.810265 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750288ad-af7e-4723-90a9-00b070e12063","Type":"ContainerDied","Data":"7dd4ac30727d5a905b927d152ba3e6544fc797e5b821687d84970324350c2812"} Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.810298 4841 scope.go:117] "RemoveContainer" containerID="fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.810521 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.839464 4841 scope.go:117] "RemoveContainer" containerID="89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.862131 4841 scope.go:117] "RemoveContainer" containerID="9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.886316 4841 scope.go:117] "RemoveContainer" containerID="24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.894540 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-config-data" (OuterVolumeSpecName: "config-data") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.896094 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "750288ad-af7e-4723-90a9-00b070e12063" (UID: "750288ad-af7e-4723-90a9-00b070e12063"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.909086 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.909136 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.909155 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750288ad-af7e-4723-90a9-00b070e12063-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.912012 4841 scope.go:117] "RemoveContainer" containerID="fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6" Jan 30 05:28:47 crc kubenswrapper[4841]: E0130 05:28:47.914842 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6\": container with ID starting with fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6 not found: ID does not exist" containerID="fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.914911 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6"} err="failed to get container status \"fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6\": rpc error: code = NotFound desc = could not find container \"fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6\": container with ID starting 
with fa90f4527a1ba7c8bde2af4cc2d876ea729bad5096939bbb372451e1678c36a6 not found: ID does not exist" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.914963 4841 scope.go:117] "RemoveContainer" containerID="89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5" Jan 30 05:28:47 crc kubenswrapper[4841]: E0130 05:28:47.915694 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5\": container with ID starting with 89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5 not found: ID does not exist" containerID="89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.915746 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5"} err="failed to get container status \"89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5\": rpc error: code = NotFound desc = could not find container \"89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5\": container with ID starting with 89d4a7224aaedfcb6675cbcdd00a063e34efa3fffc683a867173035548a200b5 not found: ID does not exist" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.915773 4841 scope.go:117] "RemoveContainer" containerID="9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a" Jan 30 05:28:47 crc kubenswrapper[4841]: E0130 05:28:47.916113 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a\": container with ID starting with 9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a not found: ID does not exist" containerID="9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a" Jan 30 05:28:47 
crc kubenswrapper[4841]: I0130 05:28:47.916149 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a"} err="failed to get container status \"9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a\": rpc error: code = NotFound desc = could not find container \"9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a\": container with ID starting with 9d16ded2de5d846c27b7631cc3d217b4c8576b9369aeccfa0f627f753f8ad27a not found: ID does not exist" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.916168 4841 scope.go:117] "RemoveContainer" containerID="24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674" Jan 30 05:28:47 crc kubenswrapper[4841]: E0130 05:28:47.916504 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674\": container with ID starting with 24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674 not found: ID does not exist" containerID="24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674" Jan 30 05:28:47 crc kubenswrapper[4841]: I0130 05:28:47.916533 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674"} err="failed to get container status \"24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674\": rpc error: code = NotFound desc = could not find container \"24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674\": container with ID starting with 24f6ff4f10062f8c1fd26fee9f9971c16e470f1111b13ac0f05238266cf18674 not found: ID does not exist" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.225911 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:48 crc 
kubenswrapper[4841]: I0130 05:28:48.236848 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.247803 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:48 crc kubenswrapper[4841]: E0130 05:28:48.248335 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="proxy-httpd" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.248433 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="proxy-httpd" Jan 30 05:28:48 crc kubenswrapper[4841]: E0130 05:28:48.248502 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="sg-core" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.248582 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="sg-core" Jan 30 05:28:48 crc kubenswrapper[4841]: E0130 05:28:48.248641 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-central-agent" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.248690 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-central-agent" Jan 30 05:28:48 crc kubenswrapper[4841]: E0130 05:28:48.248740 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-notification-agent" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.248797 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-notification-agent" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.249036 4841 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="sg-core" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.249100 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="proxy-httpd" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.249173 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-central-agent" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.249229 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="750288ad-af7e-4723-90a9-00b070e12063" containerName="ceilometer-notification-agent" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.250872 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.253012 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.255951 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.256766 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.259580 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.316867 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwcns\" (UniqueName: \"kubernetes.io/projected/84508935-db34-4e2b-a3af-800ac432353b-kube-api-access-wwcns\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.316950 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-config-data\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.317008 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-run-httpd\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.317035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.317053 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.317078 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-log-httpd\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.317094 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-scripts\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.317110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.374975 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.418587 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcgv9\" (UniqueName: \"kubernetes.io/projected/5a490e56-2eb9-4f78-8318-0ec31d0a545d-kube-api-access-gcgv9\") pod \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.418745 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-combined-ca-bundle\") pod \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.418825 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a490e56-2eb9-4f78-8318-0ec31d0a545d-logs\") pod \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.418888 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-config-data\") pod \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\" (UID: \"5a490e56-2eb9-4f78-8318-0ec31d0a545d\") " Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.419262 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.419314 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.419363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-log-httpd\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.419422 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-scripts\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.419456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 
05:28:48.419542 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcns\" (UniqueName: \"kubernetes.io/projected/84508935-db34-4e2b-a3af-800ac432353b-kube-api-access-wwcns\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.419676 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-config-data\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.419791 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-run-httpd\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.420140 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a490e56-2eb9-4f78-8318-0ec31d0a545d-logs" (OuterVolumeSpecName: "logs") pod "5a490e56-2eb9-4f78-8318-0ec31d0a545d" (UID: "5a490e56-2eb9-4f78-8318-0ec31d0a545d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.420235 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-log-httpd\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.420393 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-run-httpd\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.428544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.428620 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-scripts\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.429513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.429906 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5a490e56-2eb9-4f78-8318-0ec31d0a545d-kube-api-access-gcgv9" (OuterVolumeSpecName: "kube-api-access-gcgv9") pod "5a490e56-2eb9-4f78-8318-0ec31d0a545d" (UID: "5a490e56-2eb9-4f78-8318-0ec31d0a545d"). InnerVolumeSpecName "kube-api-access-gcgv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.429958 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-config-data\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.439443 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.440688 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750288ad-af7e-4723-90a9-00b070e12063" path="/var/lib/kubelet/pods/750288ad-af7e-4723-90a9-00b070e12063/volumes" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.442478 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcns\" (UniqueName: \"kubernetes.io/projected/84508935-db34-4e2b-a3af-800ac432353b-kube-api-access-wwcns\") pod \"ceilometer-0\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.468603 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-config-data" (OuterVolumeSpecName: "config-data") pod "5a490e56-2eb9-4f78-8318-0ec31d0a545d" (UID: "5a490e56-2eb9-4f78-8318-0ec31d0a545d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.474576 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a490e56-2eb9-4f78-8318-0ec31d0a545d" (UID: "5a490e56-2eb9-4f78-8318-0ec31d0a545d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.526066 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcgv9\" (UniqueName: \"kubernetes.io/projected/5a490e56-2eb9-4f78-8318-0ec31d0a545d-kube-api-access-gcgv9\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.527166 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.527384 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a490e56-2eb9-4f78-8318-0ec31d0a545d-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.527418 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a490e56-2eb9-4f78-8318-0ec31d0a545d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.567763 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.823331 4841 generic.go:334] "Generic (PLEG): container finished" podID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerID="7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712" exitCode=0 Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.823387 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a490e56-2eb9-4f78-8318-0ec31d0a545d","Type":"ContainerDied","Data":"7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712"} Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.823426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a490e56-2eb9-4f78-8318-0ec31d0a545d","Type":"ContainerDied","Data":"abac83d77c723046c5be74f2b4ce96674ebe2e75a64cc3dd5244aa16d92095b4"} Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.823443 4841 scope.go:117] "RemoveContainer" containerID="7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.823563 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.865810 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.872617 4841 scope.go:117] "RemoveContainer" containerID="8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.874665 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.887193 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:48 crc kubenswrapper[4841]: E0130 05:28:48.887611 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-api" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.887629 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-api" Jan 30 05:28:48 crc kubenswrapper[4841]: E0130 05:28:48.887657 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-log" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.887663 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-log" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.887845 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-log" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.887864 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" containerName="nova-api-api" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.888860 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.890845 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.891541 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.892081 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.893668 4841 scope.go:117] "RemoveContainer" containerID="7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712" Jan 30 05:28:48 crc kubenswrapper[4841]: E0130 05:28:48.894083 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712\": container with ID starting with 7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712 not found: ID does not exist" containerID="7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.894109 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712"} err="failed to get container status \"7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712\": rpc error: code = NotFound desc = could not find container \"7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712\": container with ID starting with 7efabd41c371075df51328c6eac1cdf25543bba14282247606119cd549c87712 not found: ID does not exist" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.894127 4841 scope.go:117] "RemoveContainer" containerID="8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33" Jan 30 05:28:48 crc 
kubenswrapper[4841]: E0130 05:28:48.894349 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33\": container with ID starting with 8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33 not found: ID does not exist" containerID="8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.894368 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33"} err="failed to get container status \"8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33\": rpc error: code = NotFound desc = could not find container \"8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33\": container with ID starting with 8e925d49b71509617a9c2f4d1f58c168665ab205295303f91645f7a270c3ae33 not found: ID does not exist" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.900336 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.936167 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-logs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.936279 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.936319 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.936472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wckn\" (UniqueName: \"kubernetes.io/projected/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-kube-api-access-8wckn\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.936542 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-public-tls-certs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:48 crc kubenswrapper[4841]: I0130 05:28:48.936615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-config-data\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.012383 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.038859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-public-tls-certs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.038941 
4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-config-data\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.039015 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-logs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.039067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.039112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.039173 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wckn\" (UniqueName: \"kubernetes.io/projected/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-kube-api-access-8wckn\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.039940 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-logs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " 
pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.043639 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-public-tls-certs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.044119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.049808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-config-data\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.050956 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.063681 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wckn\" (UniqueName: \"kubernetes.io/projected/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-kube-api-access-8wckn\") pod \"nova-api-0\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") " pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.138322 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 
05:28:49.182490 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.209956 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.686816 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:28:49 crc kubenswrapper[4841]: W0130 05:28:49.699893 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaf8561f_e23f_4be5_a7ee_f84ce16a5a63.slice/crio-6ae781c29d9d8791f49aed0456eeb68fc5126e52826f653cd08ddee7c83de623 WatchSource:0}: Error finding container 6ae781c29d9d8791f49aed0456eeb68fc5126e52826f653cd08ddee7c83de623: Status 404 returned error can't find the container with id 6ae781c29d9d8791f49aed0456eeb68fc5126e52826f653cd08ddee7c83de623 Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.845714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63","Type":"ContainerStarted","Data":"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"} Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.845774 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63","Type":"ContainerStarted","Data":"6ae781c29d9d8791f49aed0456eeb68fc5126e52826f653cd08ddee7c83de623"} Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.849742 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerStarted","Data":"cfc0ae345e05eb3f086c878d95ec4681a8950e4f7bd6e49a010f4a0710f07baa"} Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.849772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerStarted","Data":"fb4312d698aba3570b2814baa1b78d3ca797f5d3fabd9b6ab07c441e6ea68afc"} Jan 30 05:28:49 crc kubenswrapper[4841]: I0130 05:28:49.869361 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.004142 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zszt7"] Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.005996 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.012662 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.012827 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.016335 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zszt7"] Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.057761 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.058098 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-config-data\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " 
pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.058144 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-scripts\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.058164 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzwt\" (UniqueName: \"kubernetes.io/projected/65017d10-45fc-4a63-adbf-b09b4adafb4e-kube-api-access-grzwt\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.159799 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-config-data\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.159924 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-scripts\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.159957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzwt\" (UniqueName: \"kubernetes.io/projected/65017d10-45fc-4a63-adbf-b09b4adafb4e-kube-api-access-grzwt\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " 
pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.160047 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.167076 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-scripts\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.167188 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-config-data\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.167366 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.182611 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzwt\" (UniqueName: \"kubernetes.io/projected/65017d10-45fc-4a63-adbf-b09b4adafb4e-kube-api-access-grzwt\") pod \"nova-cell1-cell-mapping-zszt7\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc 
kubenswrapper[4841]: I0130 05:28:50.330208 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.444746 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a490e56-2eb9-4f78-8318-0ec31d0a545d" path="/var/lib/kubelet/pods/5a490e56-2eb9-4f78-8318-0ec31d0a545d/volumes" Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.785558 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zszt7"] Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.865235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63","Type":"ContainerStarted","Data":"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"} Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.868972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zszt7" event={"ID":"65017d10-45fc-4a63-adbf-b09b4adafb4e","Type":"ContainerStarted","Data":"6a92c36ca5a8eddaffbfb97ff6ea1f526daaf14c6119a4926d657ea9743fdb62"} Jan 30 05:28:50 crc kubenswrapper[4841]: I0130 05:28:50.873336 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerStarted","Data":"d71848d69df407548c21d416cd1905e03bc4273aed56307ec048215b6bd60f64"} Jan 30 05:28:51 crc kubenswrapper[4841]: I0130 05:28:51.887912 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zszt7" event={"ID":"65017d10-45fc-4a63-adbf-b09b4adafb4e","Type":"ContainerStarted","Data":"a09f001bb78e76ebfd5fd7c52c6c812ab40aeee42dc914fb41353c36d1cdd59c"} Jan 30 05:28:51 crc kubenswrapper[4841]: I0130 05:28:51.893050 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerStarted","Data":"e8fd8e866eed90045f99756e46172a271386a959f2b77da427ad4735cdb79473"} Jan 30 05:28:51 crc kubenswrapper[4841]: I0130 05:28:51.913971 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.9139442300000002 podStartE2EDuration="3.91394423s" podCreationTimestamp="2026-01-30 05:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:50.895982322 +0000 UTC m=+1267.889455000" watchObservedRunningTime="2026-01-30 05:28:51.91394423 +0000 UTC m=+1268.907416908" Jan 30 05:28:51 crc kubenswrapper[4841]: I0130 05:28:51.920003 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zszt7" podStartSLOduration=2.9199817919999997 podStartE2EDuration="2.919981792s" podCreationTimestamp="2026-01-30 05:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:28:51.918922274 +0000 UTC m=+1268.912394932" watchObservedRunningTime="2026-01-30 05:28:51.919981792 +0000 UTC m=+1268.913454470" Jan 30 05:28:52 crc kubenswrapper[4841]: I0130 05:28:52.270783 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:28:52 crc kubenswrapper[4841]: I0130 05:28:52.397276 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-8nh7c"] Jan 30 05:28:52 crc kubenswrapper[4841]: I0130 05:28:52.402599 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" podUID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerName="dnsmasq-dns" containerID="cri-o://eb30ed9d206df00865353e2b78d42fe6279ddd3397ab2ec90f8a6d55b5a27596" gracePeriod=10 Jan 30 
05:28:52 crc kubenswrapper[4841]: I0130 05:28:52.929008 4841 generic.go:334] "Generic (PLEG): container finished" podID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerID="eb30ed9d206df00865353e2b78d42fe6279ddd3397ab2ec90f8a6d55b5a27596" exitCode=0 Jan 30 05:28:52 crc kubenswrapper[4841]: I0130 05:28:52.929089 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" event={"ID":"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4","Type":"ContainerDied","Data":"eb30ed9d206df00865353e2b78d42fe6279ddd3397ab2ec90f8a6d55b5a27596"} Jan 30 05:28:52 crc kubenswrapper[4841]: I0130 05:28:52.948656 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.026316 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-svc\") pod \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.026504 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-sb\") pod \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.026530 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhpc6\" (UniqueName: \"kubernetes.io/projected/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-kube-api-access-vhpc6\") pod \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.026549 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-swift-storage-0\") pod \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.026594 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-nb\") pod \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.026619 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-config\") pod \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\" (UID: \"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4\") " Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.056883 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-kube-api-access-vhpc6" (OuterVolumeSpecName: "kube-api-access-vhpc6") pod "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" (UID: "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4"). InnerVolumeSpecName "kube-api-access-vhpc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.080374 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" (UID: "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.081484 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" (UID: "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.091738 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-config" (OuterVolumeSpecName: "config") pod "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" (UID: "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.095507 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" (UID: "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.099037 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" (UID: "e1d6a4b8-b961-48f0-b440-3ee2ef5472c4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.128901 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.128934 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhpc6\" (UniqueName: \"kubernetes.io/projected/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-kube-api-access-vhpc6\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.128946 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.128954 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.128963 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.128973 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.942737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerStarted","Data":"2888bc3f6d872d8a4049937fb5ba2193d6506521580fd4ed32cf36760940a837"} Jan 30 05:28:53 crc kubenswrapper[4841]: 
I0130 05:28:53.947025 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" event={"ID":"e1d6a4b8-b961-48f0-b440-3ee2ef5472c4","Type":"ContainerDied","Data":"889e9b28afb1cf6ce1fe2f45668b36937c0c648cec467195a35ca0da45014d84"} Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.947084 4841 scope.go:117] "RemoveContainer" containerID="eb30ed9d206df00865353e2b78d42fe6279ddd3397ab2ec90f8a6d55b5a27596" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.947253 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-8nh7c" Jan 30 05:28:53 crc kubenswrapper[4841]: I0130 05:28:53.993719 4841 scope.go:117] "RemoveContainer" containerID="4229641e7fecfa26b115d7c44f70e8ff0d358d403e0c398cd81f68758b65a24d" Jan 30 05:28:54 crc kubenswrapper[4841]: I0130 05:28:54.015390 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.369141207 podStartE2EDuration="6.015373952s" podCreationTimestamp="2026-01-30 05:28:48 +0000 UTC" firstStartedPulling="2026-01-30 05:28:49.016384068 +0000 UTC m=+1266.009856706" lastFinishedPulling="2026-01-30 05:28:52.662616823 +0000 UTC m=+1269.656089451" observedRunningTime="2026-01-30 05:28:53.994109079 +0000 UTC m=+1270.987581757" watchObservedRunningTime="2026-01-30 05:28:54.015373952 +0000 UTC m=+1271.008846590" Jan 30 05:28:54 crc kubenswrapper[4841]: I0130 05:28:54.019447 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-8nh7c"] Jan 30 05:28:54 crc kubenswrapper[4841]: I0130 05:28:54.027169 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-8nh7c"] Jan 30 05:28:54 crc kubenswrapper[4841]: I0130 05:28:54.451160 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" 
path="/var/lib/kubelet/pods/e1d6a4b8-b961-48f0-b440-3ee2ef5472c4/volumes" Jan 30 05:28:54 crc kubenswrapper[4841]: I0130 05:28:54.964066 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 05:28:55 crc kubenswrapper[4841]: I0130 05:28:55.974786 4841 generic.go:334] "Generic (PLEG): container finished" podID="65017d10-45fc-4a63-adbf-b09b4adafb4e" containerID="a09f001bb78e76ebfd5fd7c52c6c812ab40aeee42dc914fb41353c36d1cdd59c" exitCode=0 Jan 30 05:28:55 crc kubenswrapper[4841]: I0130 05:28:55.974868 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zszt7" event={"ID":"65017d10-45fc-4a63-adbf-b09b4adafb4e","Type":"ContainerDied","Data":"a09f001bb78e76ebfd5fd7c52c6c812ab40aeee42dc914fb41353c36d1cdd59c"} Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.436622 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zszt7" Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.520710 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-combined-ca-bundle\") pod \"65017d10-45fc-4a63-adbf-b09b4adafb4e\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.521003 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-scripts\") pod \"65017d10-45fc-4a63-adbf-b09b4adafb4e\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.521077 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-config-data\") pod \"65017d10-45fc-4a63-adbf-b09b4adafb4e\" (UID: 
\"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.521161 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzwt\" (UniqueName: \"kubernetes.io/projected/65017d10-45fc-4a63-adbf-b09b4adafb4e-kube-api-access-grzwt\") pod \"65017d10-45fc-4a63-adbf-b09b4adafb4e\" (UID: \"65017d10-45fc-4a63-adbf-b09b4adafb4e\") " Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.526097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65017d10-45fc-4a63-adbf-b09b4adafb4e-kube-api-access-grzwt" (OuterVolumeSpecName: "kube-api-access-grzwt") pod "65017d10-45fc-4a63-adbf-b09b4adafb4e" (UID: "65017d10-45fc-4a63-adbf-b09b4adafb4e"). InnerVolumeSpecName "kube-api-access-grzwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.548511 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-scripts" (OuterVolumeSpecName: "scripts") pod "65017d10-45fc-4a63-adbf-b09b4adafb4e" (UID: "65017d10-45fc-4a63-adbf-b09b4adafb4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.553967 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65017d10-45fc-4a63-adbf-b09b4adafb4e" (UID: "65017d10-45fc-4a63-adbf-b09b4adafb4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.558500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-config-data" (OuterVolumeSpecName: "config-data") pod "65017d10-45fc-4a63-adbf-b09b4adafb4e" (UID: "65017d10-45fc-4a63-adbf-b09b4adafb4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.623055 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.623082 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.623095 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grzwt\" (UniqueName: \"kubernetes.io/projected/65017d10-45fc-4a63-adbf-b09b4adafb4e-kube-api-access-grzwt\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:57 crc kubenswrapper[4841]: I0130 05:28:57.623132 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65017d10-45fc-4a63-adbf-b09b4adafb4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.006627 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zszt7" event={"ID":"65017d10-45fc-4a63-adbf-b09b4adafb4e","Type":"ContainerDied","Data":"6a92c36ca5a8eddaffbfb97ff6ea1f526daaf14c6119a4926d657ea9743fdb62"}
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.006692 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a92c36ca5a8eddaffbfb97ff6ea1f526daaf14c6119a4926d657ea9743fdb62"
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.006782 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zszt7"
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.216568 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.217174 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ade12f79-5499-4065-8888-40fc5e50bfe5" containerName="nova-scheduler-scheduler" containerID="cri-o://c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193" gracePeriod=30
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.239922 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.240290 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-log" containerID="cri-o://479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52" gracePeriod=30
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.240519 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-api" containerID="cri-o://b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899" gracePeriod=30
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.399296 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.399576 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-log" containerID="cri-o://29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376" gracePeriod=30
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.399676 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-metadata" containerID="cri-o://e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154" gracePeriod=30
Jan 30 05:28:58 crc kubenswrapper[4841]: I0130 05:28:58.942184 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.015936 4841 generic.go:334] "Generic (PLEG): container finished" podID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerID="b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899" exitCode=0
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.015967 4841 generic.go:334] "Generic (PLEG): container finished" podID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerID="479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52" exitCode=143
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.016001 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63","Type":"ContainerDied","Data":"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"}
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.016024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63","Type":"ContainerDied","Data":"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"}
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.016033 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63","Type":"ContainerDied","Data":"6ae781c29d9d8791f49aed0456eeb68fc5126e52826f653cd08ddee7c83de623"}
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.016047 4841 scope.go:117] "RemoveContainer" containerID="b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.016142 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.025014 4841 generic.go:334] "Generic (PLEG): container finished" podID="585eb87e-f9af-4f01-9386-b28940a3d039" containerID="29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376" exitCode=143
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.025060 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"585eb87e-f9af-4f01-9386-b28940a3d039","Type":"ContainerDied","Data":"29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376"}
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.039790 4841 scope.go:117] "RemoveContainer" containerID="479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.075340 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-config-data\") pod \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") "
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.075496 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-internal-tls-certs\") pod \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") "
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.075589 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wckn\" (UniqueName: \"kubernetes.io/projected/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-kube-api-access-8wckn\") pod \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") "
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.075770 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-public-tls-certs\") pod \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") "
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.075958 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-logs\") pod \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") "
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.076052 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-combined-ca-bundle\") pod \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\" (UID: \"aaf8561f-e23f-4be5-a7ee-f84ce16a5a63\") "
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.078846 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-logs" (OuterVolumeSpecName: "logs") pod "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" (UID: "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.087879 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-kube-api-access-8wckn" (OuterVolumeSpecName: "kube-api-access-8wckn") pod "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" (UID: "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63"). InnerVolumeSpecName "kube-api-access-8wckn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.094045 4841 scope.go:117] "RemoveContainer" containerID="b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"
Jan 30 05:28:59 crc kubenswrapper[4841]: E0130 05:28:59.096848 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899\": container with ID starting with b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899 not found: ID does not exist" containerID="b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.096911 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"} err="failed to get container status \"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899\": rpc error: code = NotFound desc = could not find container \"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899\": container with ID starting with b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899 not found: ID does not exist"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.096946 4841 scope.go:117] "RemoveContainer" containerID="479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"
Jan 30 05:28:59 crc kubenswrapper[4841]: E0130 05:28:59.097306 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52\": container with ID starting with 479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52 not found: ID does not exist" containerID="479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.097351 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"} err="failed to get container status \"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52\": rpc error: code = NotFound desc = could not find container \"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52\": container with ID starting with 479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52 not found: ID does not exist"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.097382 4841 scope.go:117] "RemoveContainer" containerID="b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.099329 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899"} err="failed to get container status \"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899\": rpc error: code = NotFound desc = could not find container \"b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899\": container with ID starting with b40d5c165d8aa158b1357a2d705d76247be2400c79922e260fb47c43ea08d899 not found: ID does not exist"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.099382 4841 scope.go:117] "RemoveContainer" containerID="479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.099886 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52"} err="failed to get container status \"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52\": rpc error: code = NotFound desc = could not find container \"479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52\": container with ID starting with 479899bfa22e2ef69183931b9cc079d4a4e6b25fe911efc3f4e981d6f7220d52 not found: ID does not exist"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.112800 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" (UID: "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.127917 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-config-data" (OuterVolumeSpecName: "config-data") pod "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" (UID: "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.144311 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" (UID: "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.146726 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" (UID: "aaf8561f-e23f-4be5-a7ee-f84ce16a5a63"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.179282 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.179314 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.179330 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wckn\" (UniqueName: \"kubernetes.io/projected/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-kube-api-access-8wckn\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.179341 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.179352 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-logs\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.179365 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.369715 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.390506 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.405154 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:59 crc kubenswrapper[4841]: E0130 05:28:59.405664 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65017d10-45fc-4a63-adbf-b09b4adafb4e" containerName="nova-manage"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.405709 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="65017d10-45fc-4a63-adbf-b09b4adafb4e" containerName="nova-manage"
Jan 30 05:28:59 crc kubenswrapper[4841]: E0130 05:28:59.405734 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerName="init"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.405744 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerName="init"
Jan 30 05:28:59 crc kubenswrapper[4841]: E0130 05:28:59.405770 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-log"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.405779 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-log"
Jan 30 05:28:59 crc kubenswrapper[4841]: E0130 05:28:59.405793 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerName="dnsmasq-dns"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.405801 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerName="dnsmasq-dns"
Jan 30 05:28:59 crc kubenswrapper[4841]: E0130 05:28:59.405816 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-api"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.405825 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-api"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.406051 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-api"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.406067 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="65017d10-45fc-4a63-adbf-b09b4adafb4e" containerName="nova-manage"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.406086 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" containerName="nova-api-log"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.406112 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1d6a4b8-b961-48f0-b440-3ee2ef5472c4" containerName="dnsmasq-dns"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.407282 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.411207 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.411207 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.412524 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.415952 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.484155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-config-data\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.484208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-logs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.484229 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bv5s\" (UniqueName: \"kubernetes.io/projected/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-kube-api-access-7bv5s\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.484250 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.484356 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:28:59 crc kubenswrapper[4841]: I0130 05:28:59.484419 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.585874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-config-data\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.585951 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-logs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.585973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bv5s\" (UniqueName: \"kubernetes.io/projected/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-kube-api-access-7bv5s\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.586001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.586086 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.586144 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.589036 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-logs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.599429 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-config-data\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.602266 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.603857 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.604148 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.611236 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bv5s\" (UniqueName: \"kubernetes.io/projected/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-kube-api-access-7bv5s\") pod \"nova-api-0\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.783640 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.906365 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.994868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-combined-ca-bundle\") pod \"ade12f79-5499-4065-8888-40fc5e50bfe5\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") "
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.995057 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-config-data\") pod \"ade12f79-5499-4065-8888-40fc5e50bfe5\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") "
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.995133 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9n72\" (UniqueName: \"kubernetes.io/projected/ade12f79-5499-4065-8888-40fc5e50bfe5-kube-api-access-s9n72\") pod \"ade12f79-5499-4065-8888-40fc5e50bfe5\" (UID: \"ade12f79-5499-4065-8888-40fc5e50bfe5\") "
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:28:59.999153 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade12f79-5499-4065-8888-40fc5e50bfe5-kube-api-access-s9n72" (OuterVolumeSpecName: "kube-api-access-s9n72") pod "ade12f79-5499-4065-8888-40fc5e50bfe5" (UID: "ade12f79-5499-4065-8888-40fc5e50bfe5"). InnerVolumeSpecName "kube-api-access-s9n72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.020831 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ade12f79-5499-4065-8888-40fc5e50bfe5" (UID: "ade12f79-5499-4065-8888-40fc5e50bfe5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.027197 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-config-data" (OuterVolumeSpecName: "config-data") pod "ade12f79-5499-4065-8888-40fc5e50bfe5" (UID: "ade12f79-5499-4065-8888-40fc5e50bfe5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.033495 4841 generic.go:334] "Generic (PLEG): container finished" podID="ade12f79-5499-4065-8888-40fc5e50bfe5" containerID="c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193" exitCode=0
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.033576 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.033567 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ade12f79-5499-4065-8888-40fc5e50bfe5","Type":"ContainerDied","Data":"c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193"}
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.033731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ade12f79-5499-4065-8888-40fc5e50bfe5","Type":"ContainerDied","Data":"28bb4af9c2cfd343dc852acc94e669e3f0f92024eb81036204bc07d6b0ecdb93"}
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.033771 4841 scope.go:117] "RemoveContainer" containerID="c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.056700 4841 scope.go:117] "RemoveContainer" containerID="c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193"
Jan 30 05:29:00 crc kubenswrapper[4841]: E0130 05:29:00.057778 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193\": container with ID starting with c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193 not found: ID does not exist" containerID="c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.057816 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193"} err="failed to get container status \"c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193\": rpc error: code = NotFound desc = could not find container \"c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193\": container with ID starting with c90ce5d16da40b353e1ba4e6a0167062c6a53771abb6f9fffdf69974b8855193 not found: ID does not exist"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.075501 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.090289 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.098174 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9n72\" (UniqueName: \"kubernetes.io/projected/ade12f79-5499-4065-8888-40fc5e50bfe5-kube-api-access-s9n72\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.098209 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.098221 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade12f79-5499-4065-8888-40fc5e50bfe5-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.100743 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:29:00 crc kubenswrapper[4841]: E0130 05:29:00.101287 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade12f79-5499-4065-8888-40fc5e50bfe5" containerName="nova-scheduler-scheduler"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.101305 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade12f79-5499-4065-8888-40fc5e50bfe5" containerName="nova-scheduler-scheduler"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.101610 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade12f79-5499-4065-8888-40fc5e50bfe5" containerName="nova-scheduler-scheduler"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.102521 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.105548 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.123163 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.199732 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.199799 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw55s\" (UniqueName: \"kubernetes.io/projected/86140170-ca48-47e9-b587-43f98f3624c1-kube-api-access-nw55s\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.199969 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-config-data\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.302716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw55s\" (UniqueName: \"kubernetes.io/projected/86140170-ca48-47e9-b587-43f98f3624c1-kube-api-access-nw55s\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.303276 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-config-data\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.303465 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.308849 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-config-data\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.311154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.322866 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw55s\" (UniqueName: \"kubernetes.io/projected/86140170-ca48-47e9-b587-43f98f3624c1-kube-api-access-nw55s\") pod \"nova-scheduler-0\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.423114 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.444998 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf8561f-e23f-4be5-a7ee-f84ce16a5a63" path="/var/lib/kubelet/pods/aaf8561f-e23f-4be5-a7ee-f84ce16a5a63/volumes"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.445586 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade12f79-5499-4065-8888-40fc5e50bfe5" path="/var/lib/kubelet/pods/ade12f79-5499-4065-8888-40fc5e50bfe5/volumes"
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.721736 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 05:29:00 crc kubenswrapper[4841]: I0130 05:29:00.969799 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 05:29:01 crc kubenswrapper[4841]: I0130 05:29:01.046251 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0"
event={"ID":"86140170-ca48-47e9-b587-43f98f3624c1","Type":"ContainerStarted","Data":"ca260dccf3864aeff6c9a3062998d02be6a111667474abb112a582d6fba772e4"} Jan 30 05:29:01 crc kubenswrapper[4841]: I0130 05:29:01.049211 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecdab3bb-c4de-4c49-9988-d9ed592a40a7","Type":"ContainerStarted","Data":"153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47"} Jan 30 05:29:01 crc kubenswrapper[4841]: I0130 05:29:01.049240 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecdab3bb-c4de-4c49-9988-d9ed592a40a7","Type":"ContainerStarted","Data":"7191ea666925a4bd95337e8184f0dbfd12c2d8bb4996e5e1616237b7a804f5de"} Jan 30 05:29:01 crc kubenswrapper[4841]: I0130 05:29:01.558452 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:34756->10.217.0.199:8775: read: connection reset by peer" Jan 30 05:29:01 crc kubenswrapper[4841]: I0130 05:29:01.559173 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:34754->10.217.0.199:8775: read: connection reset by peer" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.012169 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.074039 4841 generic.go:334] "Generic (PLEG): container finished" podID="585eb87e-f9af-4f01-9386-b28940a3d039" containerID="e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154" exitCode=0 Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.074116 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"585eb87e-f9af-4f01-9386-b28940a3d039","Type":"ContainerDied","Data":"e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154"} Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.074147 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"585eb87e-f9af-4f01-9386-b28940a3d039","Type":"ContainerDied","Data":"9db138b14c637a9907c1609cdf21ed40d06f9114b565379717f86c64bfbe9baa"} Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.074168 4841 scope.go:117] "RemoveContainer" containerID="e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.074556 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.077191 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86140170-ca48-47e9-b587-43f98f3624c1","Type":"ContainerStarted","Data":"478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2"} Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.080851 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecdab3bb-c4de-4c49-9988-d9ed592a40a7","Type":"ContainerStarted","Data":"e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7"} Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.103879 4841 scope.go:117] "RemoveContainer" containerID="29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.124852 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.124829705 podStartE2EDuration="2.124829705s" podCreationTimestamp="2026-01-30 05:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:29:02.10424352 +0000 UTC m=+1279.097716168" watchObservedRunningTime="2026-01-30 05:29:02.124829705 +0000 UTC m=+1279.118302343" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.130665 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.130650422 podStartE2EDuration="3.130650422s" podCreationTimestamp="2026-01-30 05:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:29:02.129165112 +0000 UTC m=+1279.122637780" watchObservedRunningTime="2026-01-30 05:29:02.130650422 +0000 UTC m=+1279.124123060" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 
05:29:02.138129 4841 scope.go:117] "RemoveContainer" containerID="e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154" Jan 30 05:29:02 crc kubenswrapper[4841]: E0130 05:29:02.138442 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154\": container with ID starting with e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154 not found: ID does not exist" containerID="e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.138465 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154"} err="failed to get container status \"e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154\": rpc error: code = NotFound desc = could not find container \"e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154\": container with ID starting with e6350e2a443f00169ae40699caf8b8ae1666514d95a0e7d5e17b78cf1000e154 not found: ID does not exist" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.138484 4841 scope.go:117] "RemoveContainer" containerID="29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376" Jan 30 05:29:02 crc kubenswrapper[4841]: E0130 05:29:02.138649 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376\": container with ID starting with 29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376 not found: ID does not exist" containerID="29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.138687 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376"} err="failed to get container status \"29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376\": rpc error: code = NotFound desc = could not find container \"29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376\": container with ID starting with 29a00aae473e18258e703cb1fe56284763885560ffe5a5dc8c0d7cae47e1c376 not found: ID does not exist" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.153262 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfqj2\" (UniqueName: \"kubernetes.io/projected/585eb87e-f9af-4f01-9386-b28940a3d039-kube-api-access-qfqj2\") pod \"585eb87e-f9af-4f01-9386-b28940a3d039\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.153372 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb87e-f9af-4f01-9386-b28940a3d039-logs\") pod \"585eb87e-f9af-4f01-9386-b28940a3d039\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.153409 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-combined-ca-bundle\") pod \"585eb87e-f9af-4f01-9386-b28940a3d039\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.153427 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-nova-metadata-tls-certs\") pod \"585eb87e-f9af-4f01-9386-b28940a3d039\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.153495 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-config-data\") pod \"585eb87e-f9af-4f01-9386-b28940a3d039\" (UID: \"585eb87e-f9af-4f01-9386-b28940a3d039\") " Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.154353 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585eb87e-f9af-4f01-9386-b28940a3d039-logs" (OuterVolumeSpecName: "logs") pod "585eb87e-f9af-4f01-9386-b28940a3d039" (UID: "585eb87e-f9af-4f01-9386-b28940a3d039"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.160580 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585eb87e-f9af-4f01-9386-b28940a3d039-kube-api-access-qfqj2" (OuterVolumeSpecName: "kube-api-access-qfqj2") pod "585eb87e-f9af-4f01-9386-b28940a3d039" (UID: "585eb87e-f9af-4f01-9386-b28940a3d039"). InnerVolumeSpecName "kube-api-access-qfqj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.188925 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "585eb87e-f9af-4f01-9386-b28940a3d039" (UID: "585eb87e-f9af-4f01-9386-b28940a3d039"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.196406 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-config-data" (OuterVolumeSpecName: "config-data") pod "585eb87e-f9af-4f01-9386-b28940a3d039" (UID: "585eb87e-f9af-4f01-9386-b28940a3d039"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.212769 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "585eb87e-f9af-4f01-9386-b28940a3d039" (UID: "585eb87e-f9af-4f01-9386-b28940a3d039"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.255354 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb87e-f9af-4f01-9386-b28940a3d039-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.255430 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.255446 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.255458 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb87e-f9af-4f01-9386-b28940a3d039-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.255472 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfqj2\" (UniqueName: \"kubernetes.io/projected/585eb87e-f9af-4f01-9386-b28940a3d039-kube-api-access-qfqj2\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.412955 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.430474 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.444283 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" path="/var/lib/kubelet/pods/585eb87e-f9af-4f01-9386-b28940a3d039/volumes" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.447746 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:02 crc kubenswrapper[4841]: E0130 05:29:02.448049 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-log" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.448067 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-log" Jan 30 05:29:02 crc kubenswrapper[4841]: E0130 05:29:02.448095 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-metadata" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.448101 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-metadata" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.448283 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-log" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.448306 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="585eb87e-f9af-4f01-9386-b28940a3d039" containerName="nova-metadata-metadata" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.449245 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.451923 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.452360 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.458243 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.560494 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.560558 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9v9\" (UniqueName: \"kubernetes.io/projected/7be8df86-7b8d-4741-ae13-ec1b243549b3-kube-api-access-xd9v9\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.560595 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-config-data\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.560643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be8df86-7b8d-4741-ae13-ec1b243549b3-logs\") pod \"nova-metadata-0\" 
(UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.560695 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.662868 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.662986 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.663036 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9v9\" (UniqueName: \"kubernetes.io/projected/7be8df86-7b8d-4741-ae13-ec1b243549b3-kube-api-access-xd9v9\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.663075 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-config-data\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 
05:29:02.663122 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be8df86-7b8d-4741-ae13-ec1b243549b3-logs\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.663613 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be8df86-7b8d-4741-ae13-ec1b243549b3-logs\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.667196 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.668936 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.669722 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-config-data\") pod \"nova-metadata-0\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.692967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9v9\" (UniqueName: \"kubernetes.io/projected/7be8df86-7b8d-4741-ae13-ec1b243549b3-kube-api-access-xd9v9\") pod \"nova-metadata-0\" (UID: 
\"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " pod="openstack/nova-metadata-0" Jan 30 05:29:02 crc kubenswrapper[4841]: I0130 05:29:02.787590 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:03 crc kubenswrapper[4841]: I0130 05:29:03.299683 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:03 crc kubenswrapper[4841]: W0130 05:29:03.305307 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7be8df86_7b8d_4741_ae13_ec1b243549b3.slice/crio-fa94895eb27278be690fd9feac29050bf3fa70985c449d4f4adec0b712a1b02e WatchSource:0}: Error finding container fa94895eb27278be690fd9feac29050bf3fa70985c449d4f4adec0b712a1b02e: Status 404 returned error can't find the container with id fa94895eb27278be690fd9feac29050bf3fa70985c449d4f4adec0b712a1b02e Jan 30 05:29:04 crc kubenswrapper[4841]: I0130 05:29:04.106096 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7be8df86-7b8d-4741-ae13-ec1b243549b3","Type":"ContainerStarted","Data":"d334449b0de1d005d626d94fd883ec0192a4936b40dc9ec24b425dde3a584637"} Jan 30 05:29:04 crc kubenswrapper[4841]: I0130 05:29:04.106386 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7be8df86-7b8d-4741-ae13-ec1b243549b3","Type":"ContainerStarted","Data":"2603ac57002b34136c88ddf281a8b6cb56fddccd4b183b0ab1effc47d15e9154"} Jan 30 05:29:04 crc kubenswrapper[4841]: I0130 05:29:04.106423 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7be8df86-7b8d-4741-ae13-ec1b243549b3","Type":"ContainerStarted","Data":"fa94895eb27278be690fd9feac29050bf3fa70985c449d4f4adec0b712a1b02e"} Jan 30 05:29:04 crc kubenswrapper[4841]: I0130 05:29:04.165307 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.165287864 podStartE2EDuration="2.165287864s" podCreationTimestamp="2026-01-30 05:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 05:29:04.160726091 +0000 UTC m=+1281.154198749" watchObservedRunningTime="2026-01-30 05:29:04.165287864 +0000 UTC m=+1281.158760512" Jan 30 05:29:05 crc kubenswrapper[4841]: I0130 05:29:05.424207 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.432253 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mtk2w"] Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.436007 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtk2w" Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.443264 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mtk2w"] Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.584023 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-catalog-content\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w" Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.584138 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xf8\" (UniqueName: \"kubernetes.io/projected/2b781c5e-301c-4458-8dea-4494e6ff8ee1-kube-api-access-l2xf8\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w" Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 
05:29:07.584220 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-utilities\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.686763 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-catalog-content\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.686855 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xf8\" (UniqueName: \"kubernetes.io/projected/2b781c5e-301c-4458-8dea-4494e6ff8ee1-kube-api-access-l2xf8\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.686915 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-utilities\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.687617 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-utilities\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.688126 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-catalog-content\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.710004 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xf8\" (UniqueName: \"kubernetes.io/projected/2b781c5e-301c-4458-8dea-4494e6ff8ee1-kube-api-access-l2xf8\") pod \"redhat-operators-mtk2w\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.784295 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.788460 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 05:29:07 crc kubenswrapper[4841]: I0130 05:29:07.789163 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 05:29:08 crc kubenswrapper[4841]: I0130 05:29:08.214335 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mtk2w"]
Jan 30 05:29:08 crc kubenswrapper[4841]: W0130 05:29:08.217660 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b781c5e_301c_4458_8dea_4494e6ff8ee1.slice/crio-09e2603ddea500778f0b4c77c640fd984a140cc5d12a09f0aca41206f54f915d WatchSource:0}: Error finding container 09e2603ddea500778f0b4c77c640fd984a140cc5d12a09f0aca41206f54f915d: Status 404 returned error can't find the container with id 09e2603ddea500778f0b4c77c640fd984a140cc5d12a09f0aca41206f54f915d
Jan 30 05:29:09 crc kubenswrapper[4841]: I0130 05:29:09.172572 4841 generic.go:334] "Generic (PLEG): container finished" podID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerID="be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36" exitCode=0
Jan 30 05:29:09 crc kubenswrapper[4841]: I0130 05:29:09.173036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtk2w" event={"ID":"2b781c5e-301c-4458-8dea-4494e6ff8ee1","Type":"ContainerDied","Data":"be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36"}
Jan 30 05:29:09 crc kubenswrapper[4841]: I0130 05:29:09.173067 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtk2w" event={"ID":"2b781c5e-301c-4458-8dea-4494e6ff8ee1","Type":"ContainerStarted","Data":"09e2603ddea500778f0b4c77c640fd984a140cc5d12a09f0aca41206f54f915d"}
Jan 30 05:29:09 crc kubenswrapper[4841]: I0130 05:29:09.177139 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 05:29:09 crc kubenswrapper[4841]: I0130 05:29:09.784926 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 05:29:09 crc kubenswrapper[4841]: I0130 05:29:09.784986 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 05:29:10 crc kubenswrapper[4841]: I0130 05:29:10.186447 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtk2w" event={"ID":"2b781c5e-301c-4458-8dea-4494e6ff8ee1","Type":"ContainerStarted","Data":"47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c"}
Jan 30 05:29:10 crc kubenswrapper[4841]: I0130 05:29:10.423653 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 30 05:29:10 crc kubenswrapper[4841]: I0130 05:29:10.464448 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:29:10 crc kubenswrapper[4841]: I0130 05:29:10.464516 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:29:10 crc kubenswrapper[4841]: I0130 05:29:10.491965 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 30 05:29:10 crc kubenswrapper[4841]: I0130 05:29:10.805632 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:29:10 crc kubenswrapper[4841]: I0130 05:29:10.805674 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:29:11 crc kubenswrapper[4841]: I0130 05:29:11.249876 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 30 05:29:12 crc kubenswrapper[4841]: I0130 05:29:12.215527 4841 generic.go:334] "Generic (PLEG): container finished" podID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerID="47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c" exitCode=0
Jan 30 05:29:12 crc kubenswrapper[4841]: I0130 05:29:12.215612 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtk2w" event={"ID":"2b781c5e-301c-4458-8dea-4494e6ff8ee1","Type":"ContainerDied","Data":"47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c"}
Jan 30 05:29:12 crc kubenswrapper[4841]: I0130 05:29:12.787757 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 05:29:12 crc kubenswrapper[4841]: I0130 05:29:12.787827 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 05:29:13 crc kubenswrapper[4841]: I0130 05:29:13.804573 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:29:13 crc kubenswrapper[4841]: I0130 05:29:13.804645 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:29:14 crc kubenswrapper[4841]: I0130 05:29:14.249528 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtk2w" event={"ID":"2b781c5e-301c-4458-8dea-4494e6ff8ee1","Type":"ContainerStarted","Data":"639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35"}
Jan 30 05:29:14 crc kubenswrapper[4841]: I0130 05:29:14.286511 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mtk2w" podStartSLOduration=3.100039372 podStartE2EDuration="7.286484123s" podCreationTimestamp="2026-01-30 05:29:07 +0000 UTC" firstStartedPulling="2026-01-30 05:29:09.176807655 +0000 UTC m=+1286.170280293" lastFinishedPulling="2026-01-30 05:29:13.363252366 +0000 UTC m=+1290.356725044" observedRunningTime="2026-01-30 05:29:14.278058645 +0000 UTC m=+1291.271531323" watchObservedRunningTime="2026-01-30 05:29:14.286484123 +0000 UTC m=+1291.279956801"
Jan 30 05:29:17 crc kubenswrapper[4841]: I0130 05:29:17.784584 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:17 crc kubenswrapper[4841]: I0130 05:29:17.785269 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mtk2w"
Jan 30 05:29:18 crc kubenswrapper[4841]: I0130 05:29:18.579890 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 05:29:18 crc kubenswrapper[4841]: I0130 05:29:18.856062 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mtk2w" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:29:18 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:29:18 crc kubenswrapper[4841]: >
Jan 30 05:29:19 crc kubenswrapper[4841]: I0130 05:29:19.800868 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 05:29:19 crc kubenswrapper[4841]: I0130 05:29:19.801775 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 05:29:19 crc kubenswrapper[4841]: I0130 05:29:19.815788 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 05:29:19 crc kubenswrapper[4841]: I0130 05:29:19.815939 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 05:29:20 crc kubenswrapper[4841]: I0130 05:29:20.327679 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 05:29:20 crc kubenswrapper[4841]: I0130 05:29:20.334756 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 05:29:22 crc kubenswrapper[4841]: I0130 05:29:22.799822 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 30 05:29:22 crc kubenswrapper[4841]: I0130 05:29:22.802699 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 30 05:29:22 crc kubenswrapper[4841]: I0130 05:29:22.805964 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 30 05:29:23 crc kubenswrapper[4841]: I0130 05:29:23.366688 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 30 05:29:28 crc kubenswrapper[4841]: I0130 05:29:28.863672 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mtk2w" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:29:28 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:29:28 crc kubenswrapper[4841]: >
Jan 30 05:29:38 crc kubenswrapper[4841]: I0130 05:29:38.859603 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mtk2w" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:29:38 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:29:38 crc kubenswrapper[4841]: >
Jan 30 05:29:40 crc kubenswrapper[4841]: I0130 05:29:40.463799 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:29:40 crc kubenswrapper[4841]: I0130 05:29:40.464180 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:29:40 crc kubenswrapper[4841]: I0130 05:29:40.464233 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 05:29:40 crc kubenswrapper[4841]: I0130 05:29:40.465342 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bf8b4cf30e4dca128fb9c700ac455cfd4ad66705f2809226e958756cecd6fcb"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:29:40 crc kubenswrapper[4841]: I0130 05:29:40.465468 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://8bf8b4cf30e4dca128fb9c700ac455cfd4ad66705f2809226e958756cecd6fcb" gracePeriod=600
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.598731 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="8bf8b4cf30e4dca128fb9c700ac455cfd4ad66705f2809226e958756cecd6fcb" exitCode=0
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.598852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"8bf8b4cf30e4dca128fb9c700ac455cfd4ad66705f2809226e958756cecd6fcb"}
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.599271 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"}
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.599294 4841 scope.go:117] "RemoveContainer" containerID="344e964639b293843f58c03939c43edd9bcd822c2de642650f9f33c6e2a4eb20"
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.936685 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.937105 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" containerName="openstackclient" containerID="cri-o://3c7852e63582709a9e74b827146a470d704988cb06dbe6df77ac6ac4fc666c94" gracePeriod=2
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.956713 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.990238 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1306-account-create-update-9p79b"]
Jan 30 05:29:41 crc kubenswrapper[4841]: E0130 05:29:41.996901 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" containerName="openstackclient"
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.996935 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" containerName="openstackclient"
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.997255 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" containerName="openstackclient"
Jan 30 05:29:41 crc kubenswrapper[4841]: I0130 05:29:41.997909 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.001834 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.017457 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7z8bf"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.018583 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.021804 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.028115 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1306-account-create-update-9p79b"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.098493 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7z8bf"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.101006 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-operator-scripts\") pod \"neutron-1306-account-create-update-9p79b\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.102232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn558\" (UniqueName: \"kubernetes.io/projected/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-kube-api-access-bn558\") pod \"neutron-1306-account-create-update-9p79b\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.174308 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1306-account-create-update-mf8cj"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.203967 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts\") pod \"root-account-create-update-7z8bf\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.204018 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn558\" (UniqueName: \"kubernetes.io/projected/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-kube-api-access-bn558\") pod \"neutron-1306-account-create-update-9p79b\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.204039 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplhv\" (UniqueName: \"kubernetes.io/projected/1afed894-4dfb-4873-a45c-29b70507295a-kube-api-access-nplhv\") pod \"root-account-create-update-7z8bf\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.204155 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-operator-scripts\") pod \"neutron-1306-account-create-update-9p79b\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.204793 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-operator-scripts\") pod \"neutron-1306-account-create-update-9p79b\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.221466 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-de72-account-create-update-ckzp8"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.232893 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.244283 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.246799 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1306-account-create-update-mf8cj"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.258064 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-de72-account-create-update-ckzp8"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.279496 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xxskp"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.285422 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xxskp"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.298106 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.308692 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts\") pod \"root-account-create-update-7z8bf\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.308749 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nplhv\" (UniqueName: \"kubernetes.io/projected/1afed894-4dfb-4873-a45c-29b70507295a-kube-api-access-nplhv\") pod \"root-account-create-update-7z8bf\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.309633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts\") pod \"root-account-create-update-7z8bf\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.310173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn558\" (UniqueName: \"kubernetes.io/projected/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-kube-api-access-bn558\") pod \"neutron-1306-account-create-update-9p79b\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.320114 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1306-account-create-update-9p79b"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.397004 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9a61-account-create-update-8jgdj"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.398957 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplhv\" (UniqueName: \"kubernetes.io/projected/1afed894-4dfb-4873-a45c-29b70507295a-kube-api-access-nplhv\") pod \"root-account-create-update-7z8bf\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.402233 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.416388 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.418491 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4335ead-4d73-4061-b069-960881c2f2d9-operator-scripts\") pod \"cinder-de72-account-create-update-ckzp8\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.418523 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c248\" (UniqueName: \"kubernetes.io/projected/f4335ead-4d73-4061-b069-960881c2f2d9-kube-api-access-5c248\") pod \"cinder-de72-account-create-update-ckzp8\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.452260 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36bee932-2dad-4fca-aff2-0170cb6d4af8" path="/var/lib/kubelet/pods/36bee932-2dad-4fca-aff2-0170cb6d4af8/volumes"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.453152 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80c9d6e-25f1-4629-97f0-724c2353944b" path="/var/lib/kubelet/pods/d80c9d6e-25f1-4629-97f0-724c2353944b/volumes"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.461157 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-676d-account-create-update-tkbqn"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.475297 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9a61-account-create-update-8jgdj"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.475447 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.487903 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.503894 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-676d-account-create-update-tkbqn"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.519707 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4335ead-4d73-4061-b069-960881c2f2d9-operator-scripts\") pod \"cinder-de72-account-create-update-ckzp8\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.519742 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c248\" (UniqueName: \"kubernetes.io/projected/f4335ead-4d73-4061-b069-960881c2f2d9-kube-api-access-5c248\") pod \"cinder-de72-account-create-update-ckzp8\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: E0130 05:29:42.521327 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Jan 30 05:29:42 crc kubenswrapper[4841]: E0130 05:29:42.521387 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data podName:ad7779ad-0912-4695-853f-3ce786c2e9ae nodeName:}" failed. No retries permitted until 2026-01-30 05:29:43.021358801 +0000 UTC m=+1320.014831439 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae") : configmap "rabbitmq-cell1-config-data" not found
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.522168 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4335ead-4d73-4061-b069-960881c2f2d9-operator-scripts\") pod \"cinder-de72-account-create-update-ckzp8\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.554513 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-de72-account-create-update-8qxjx"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.582606 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-de72-account-create-update-8qxjx"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.616173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c248\" (UniqueName: \"kubernetes.io/projected/f4335ead-4d73-4061-b069-960881c2f2d9-kube-api-access-5c248\") pod \"cinder-de72-account-create-update-ckzp8\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.623381 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff1ce10f-fa3e-4526-a823-f4defdaf9085-operator-scripts\") pod \"barbican-676d-account-create-update-tkbqn\" (UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.623476 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1bee18-1dd7-42be-b113-6a746b3ff70d-operator-scripts\") pod \"glance-9a61-account-create-update-8jgdj\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.623548 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8z62\" (UniqueName: \"kubernetes.io/projected/4e1bee18-1dd7-42be-b113-6a746b3ff70d-kube-api-access-q8z62\") pod \"glance-9a61-account-create-update-8jgdj\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.623648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vgv5\" (UniqueName: \"kubernetes.io/projected/ff1ce10f-fa3e-4526-a823-f4defdaf9085-kube-api-access-9vgv5\") pod \"barbican-676d-account-create-update-tkbqn\" (UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.654862 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7z8bf"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.720488 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9a61-account-create-update-hb6tt"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.728600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8z62\" (UniqueName: \"kubernetes.io/projected/4e1bee18-1dd7-42be-b113-6a746b3ff70d-kube-api-access-q8z62\") pod \"glance-9a61-account-create-update-8jgdj\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.728717 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vgv5\" (UniqueName: \"kubernetes.io/projected/ff1ce10f-fa3e-4526-a823-f4defdaf9085-kube-api-access-9vgv5\") pod \"barbican-676d-account-create-update-tkbqn\" (UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.728751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff1ce10f-fa3e-4526-a823-f4defdaf9085-operator-scripts\") pod \"barbican-676d-account-create-update-tkbqn\" (UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.728790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1bee18-1dd7-42be-b113-6a746b3ff70d-operator-scripts\") pod \"glance-9a61-account-create-update-8jgdj\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.729484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1bee18-1dd7-42be-b113-6a746b3ff70d-operator-scripts\") pod \"glance-9a61-account-create-update-8jgdj\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.729610 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff1ce10f-fa3e-4526-a823-f4defdaf9085-operator-scripts\") pod \"barbican-676d-account-create-update-tkbqn\" (UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.733014 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-676d-account-create-update-bdk8r"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.753192 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-676d-account-create-update-bdk8r"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.753243 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8z7z8"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.760222 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-t7thb"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.769488 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-t7thb"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.777164 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8z7z8"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.791003 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.798301 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8z62\" (UniqueName: \"kubernetes.io/projected/4e1bee18-1dd7-42be-b113-6a746b3ff70d-kube-api-access-q8z62\") pod \"glance-9a61-account-create-update-8jgdj\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.808359 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9a61-account-create-update-hb6tt"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.815020 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vgv5\" (UniqueName: \"kubernetes.io/projected/ff1ce10f-fa3e-4526-a823-f4defdaf9085-kube-api-access-9vgv5\") pod \"barbican-676d-account-create-update-tkbqn\" (UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.828819 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9a61-account-create-update-8jgdj"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.847627 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-676d-account-create-update-tkbqn"
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.876233 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.876687 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="openstack-network-exporter" containerID="cri-o://d9d5bddafde38d3fd047f16407b7b2377643bde704520227bda10004fa73dc8d" gracePeriod=300
Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.885200 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-de72-account-create-update-ckzp8"
Jan 30 05:29:42 crc kubenswrapper[4841]: E0130 05:29:42.946336 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Jan 30 05:29:42 crc kubenswrapper[4841]: E0130 05:29:42.946389 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data podName:cc423120-ba93-465b-8ef8-871904b901ef nodeName:}" failed. No retries permitted until 2026-01-30 05:29:43.446374694 +0000 UTC m=+1320.439847332 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data") pod "rabbitmq-server-0" (UID: "cc423120-ba93-465b-8ef8-871904b901ef") : configmap "rabbitmq-config-data" not found Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.957702 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.958360 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="openstack-network-exporter" containerID="cri-o://770c37b5591c01b91dbf8c41b92ea066c0501a7dc4909ac010c8ff458d9d3823" gracePeriod=300 Jan 30 05:29:42 crc kubenswrapper[4841]: I0130 05:29:42.985214 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-b8f64"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.025472 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-b8f64"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.044962 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.047812 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="openstack-network-exporter" containerID="cri-o://465a65be37ff3c79b8c121932b87f14d4319db96cad76bedc4f91b6b55b13cf0" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.048060 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="ovn-northd" containerID="cri-o://b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 
05:29:43.051341 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 05:29:43.051428 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data podName:ad7779ad-0912-4695-853f-3ce786c2e9ae nodeName:}" failed. No retries permitted until 2026-01-30 05:29:44.051391043 +0000 UTC m=+1321.044863681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.112570 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d99c-account-create-update-xhj29"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.120865 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.134152 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.188666 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="ovsdbserver-nb" containerID="cri-o://8b17702bd7db482eb4d14d6e36c3a54793a366a5aa87ef51064106768aa101bc" gracePeriod=300 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.196937 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="ovsdbserver-sb" containerID="cri-o://eaca801d46ea5dc0cda81a019aed95bd7823bddb489326243d7b1a3fbaa6599a" gracePeriod=300 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.206079 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d99c-account-create-update-xhj29"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.292468 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-46xxt"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.302386 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.308733 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.362472 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-46xxt"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.382704 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69db2b46-7a27-45b3-9bca-fac2189f47ef-operator-scripts\") pod \"nova-api-d99c-account-create-update-xhj29\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.407442 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqfr\" (UniqueName: \"kubernetes.io/projected/69db2b46-7a27-45b3-9bca-fac2189f47ef-kube-api-access-sqqfr\") pod \"nova-api-d99c-account-create-update-xhj29\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.456045 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d99c-account-create-update-hpmhc"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.473656 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d99c-account-create-update-hpmhc"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.490193 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-n25tn"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.490451 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ovn-controller-metrics-n25tn" podUID="30c47888-1780-4539-8777-5914009b862f" containerName="openstack-network-exporter" containerID="cri-o://97894b49cfe91943a3d82f73c54427c16ccf3ca0fab823dd8f5f0d111192e569" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.507260 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mtgqw"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.511280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqfr\" (UniqueName: \"kubernetes.io/projected/69db2b46-7a27-45b3-9bca-fac2189f47ef-kube-api-access-sqqfr\") pod \"nova-api-d99c-account-create-update-xhj29\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.511429 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d8335-345d-4847-afca-b5667e0b4c0f-operator-scripts\") pod \"nova-cell0-1fec-account-create-update-46xxt\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.511456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69db2b46-7a27-45b3-9bca-fac2189f47ef-operator-scripts\") pod \"nova-api-d99c-account-create-update-xhj29\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.511508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dv67\" (UniqueName: \"kubernetes.io/projected/703d8335-345d-4847-afca-b5667e0b4c0f-kube-api-access-9dv67\") pod 
\"nova-cell0-1fec-account-create-update-46xxt\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 05:29:43.511971 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 05:29:43.512028 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data podName:cc423120-ba93-465b-8ef8-871904b901ef nodeName:}" failed. No retries permitted until 2026-01-30 05:29:44.512014824 +0000 UTC m=+1321.505487462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data") pod "rabbitmq-server-0" (UID: "cc423120-ba93-465b-8ef8-871904b901ef") : configmap "rabbitmq-config-data" not found Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.512846 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69db2b46-7a27-45b3-9bca-fac2189f47ef-operator-scripts\") pod \"nova-api-d99c-account-create-update-xhj29\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.524374 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mtgqw"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.530890 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-88rdn"] Jan 30 05:29:43 crc kubenswrapper[4841]: W0130 05:29:43.532744 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3106f00a_8ed8_4189_be2e_f5c6cce1b4ca.slice/crio-eb15ee17f4d7c8dc588c845a05a968e79aaededfe3658159c5af648558a58bc2 WatchSource:0}: Error finding container eb15ee17f4d7c8dc588c845a05a968e79aaededfe3658159c5af648558a58bc2: Status 404 returned error can't find the container with id eb15ee17f4d7c8dc588c845a05a968e79aaededfe3658159c5af648558a58bc2 Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 05:29:43.539149 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:43 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: if [ -n "neutron" ]; then Jan 30 05:29:43 crc kubenswrapper[4841]: GRANT_DATABASE="neutron" Jan 30 05:29:43 crc kubenswrapper[4841]: else Jan 30 05:29:43 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:43 crc kubenswrapper[4841]: fi Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:43 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:43 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:43 crc kubenswrapper[4841]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:43 crc kubenswrapper[4841]: # support updates Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 05:29:43.540554 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-1306-account-create-update-9p79b" podUID="3106f00a-8ed8-4189-be2e-f5c6cce1b4ca" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.545474 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-9sgmx"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.554020 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqfr\" (UniqueName: \"kubernetes.io/projected/69db2b46-7a27-45b3-9bca-fac2189f47ef-kube-api-access-sqqfr\") pod \"nova-api-d99c-account-create-update-xhj29\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.570718 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-9sgmx"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.609034 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-lbv2q"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.616701 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d8335-345d-4847-afca-b5667e0b4c0f-operator-scripts\") pod \"nova-cell0-1fec-account-create-update-46xxt\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 
05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.616763 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dv67\" (UniqueName: \"kubernetes.io/projected/703d8335-345d-4847-afca-b5667e0b4c0f-kube-api-access-9dv67\") pod \"nova-cell0-1fec-account-create-update-46xxt\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.618725 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d8335-345d-4847-afca-b5667e0b4c0f-operator-scripts\") pod \"nova-cell0-1fec-account-create-update-46xxt\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.651933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dv67\" (UniqueName: \"kubernetes.io/projected/703d8335-345d-4847-afca-b5667e0b4c0f-kube-api-access-9dv67\") pod \"nova-cell0-1fec-account-create-update-46xxt\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.684790 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-zb2qt"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.685081 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" podUID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerName="dnsmasq-dns" containerID="cri-o://4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2" gracePeriod=10 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.698307 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.701577 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dzfl5"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.707511 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5df66af1-0c57-44f7-8b54-bc351a3faa66/ovsdbserver-nb/0.log" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.707562 4841 generic.go:334] "Generic (PLEG): container finished" podID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerID="770c37b5591c01b91dbf8c41b92ea066c0501a7dc4909ac010c8ff458d9d3823" exitCode=2 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.707578 4841 generic.go:334] "Generic (PLEG): container finished" podID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerID="8b17702bd7db482eb4d14d6e36c3a54793a366a5aa87ef51064106768aa101bc" exitCode=143 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.707621 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df66af1-0c57-44f7-8b54-bc351a3faa66","Type":"ContainerDied","Data":"770c37b5591c01b91dbf8c41b92ea066c0501a7dc4909ac010c8ff458d9d3823"} Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.707646 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df66af1-0c57-44f7-8b54-bc351a3faa66","Type":"ContainerDied","Data":"8b17702bd7db482eb4d14d6e36c3a54793a366a5aa87ef51064106768aa101bc"} Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.721464 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dzfl5"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.724615 4841 generic.go:334] "Generic (PLEG): container finished" podID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerID="465a65be37ff3c79b8c121932b87f14d4319db96cad76bedc4f91b6b55b13cf0" exitCode=2 Jan 30 05:29:43 
crc kubenswrapper[4841]: I0130 05:29:43.724676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"84924340-1dd2-488e-a6e6-adfe62b61f2f","Type":"ContainerDied","Data":"465a65be37ff3c79b8c121932b87f14d4319db96cad76bedc4f91b6b55b13cf0"} Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.734210 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cd33f000-ac38-400f-95b4-d9f6a68d13c0/ovsdbserver-sb/0.log" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.734251 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerID="d9d5bddafde38d3fd047f16407b7b2377643bde704520227bda10004fa73dc8d" exitCode=2 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.734264 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerID="eaca801d46ea5dc0cda81a019aed95bd7823bddb489326243d7b1a3fbaa6599a" exitCode=143 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.734417 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd33f000-ac38-400f-95b4-d9f6a68d13c0","Type":"ContainerDied","Data":"d9d5bddafde38d3fd047f16407b7b2377643bde704520227bda10004fa73dc8d"} Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.734441 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd33f000-ac38-400f-95b4-d9f6a68d13c0","Type":"ContainerDied","Data":"eaca801d46ea5dc0cda81a019aed95bd7823bddb489326243d7b1a3fbaa6599a"} Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.739427 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fc8b6ddd6-nkc6r"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.739760 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fc8b6ddd6-nkc6r" podUID="28551500-d017-475a-aae4-8352782c0b4e" 
containerName="placement-log" containerID="cri-o://088a47d541f00148328e19a9fb5636697d24c805841cd93d6a4ac4b7d6e6779f" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.740020 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fc8b6ddd6-nkc6r" podUID="28551500-d017-475a-aae4-8352782c0b4e" containerName="placement-api" containerID="cri-o://1608c08a181e26696b13ff2abda250d1ad88e50ec15ce51f053cef31c22f983e" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.751151 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1306-account-create-update-9p79b"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.751494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1306-account-create-update-9p79b" event={"ID":"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca","Type":"ContainerStarted","Data":"eb15ee17f4d7c8dc588c845a05a968e79aaededfe3658159c5af648558a58bc2"} Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 05:29:43.753702 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:43 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: if [ -n "neutron" ]; then Jan 30 05:29:43 crc kubenswrapper[4841]: GRANT_DATABASE="neutron" Jan 
30 05:29:43 crc kubenswrapper[4841]: else Jan 30 05:29:43 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:43 crc kubenswrapper[4841]: fi Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:43 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:43 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:43 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:43 crc kubenswrapper[4841]: # support updates Jan 30 05:29:43 crc kubenswrapper[4841]: Jan 30 05:29:43 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:43 crc kubenswrapper[4841]: E0130 05:29:43.758666 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-1306-account-create-update-9p79b" podUID="3106f00a-8ed8-4189-be2e-f5c6cce1b4ca" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.771485 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.771871 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="cinder-scheduler" containerID="cri-o://008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.772042 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="probe" containerID="cri-o://0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87" gracePeriod=30 
Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.783193 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.788738 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.789022 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api-log" containerID="cri-o://111f861be82792295977d9e1509a0d34e506936e498cf079b08adb56c250f14b" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.789139 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api" containerID="cri-o://a480751fcbd360c7947b1114ae086d16d2b9b26b051a46078f3d5cb250e0a982" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.795246 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-de72-account-create-update-ckzp8"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.802757 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-164a-account-create-update-vqvpq"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.819312 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-164a-account-create-update-vqvpq"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.829493 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-65977b5879-qctf6"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.829839 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-65977b5879-qctf6" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-api" 
containerID="cri-o://507b9c74c5025e94796600d70c3425d021076e350f65b3c2da7a6f02528353c9" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.830379 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-65977b5879-qctf6" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-httpd" containerID="cri-o://71a6a5266aac4658f33e51c0327009b65b98cc2a8a908dc821c307eb9aa11b89" gracePeriod=30 Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.856075 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hpd9d"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.864037 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hpd9d"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.910658 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wnzmn"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.913481 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wnzmn"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.943726 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-zszt7"] Jan 30 05:29:43 crc kubenswrapper[4841]: I0130 05:29:43.953114 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-zszt7"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.031952 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-t5622"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.057008 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-t5622"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.083449 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.133614 4841 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jkv7n"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.170899 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ad7779ad-0912-4695-853f-3ce786c2e9ae" containerName="rabbitmq" containerID="cri-o://7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97" gracePeriod=604800 Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.178390 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.187605 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data podName:ad7779ad-0912-4695-853f-3ce786c2e9ae nodeName:}" failed. No retries permitted until 2026-01-30 05:29:46.187565106 +0000 UTC m=+1323.181037744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.186241 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jkv7n"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.235783 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5df66af1-0c57-44f7-8b54-bc351a3faa66/ovsdbserver-nb/0.log" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.235881 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.296059 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.296423 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-log" containerID="cri-o://31ef83de4f5eec8256ad9a9714b034f94d5c23abce8bac25f0acaf273fe574d8" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.296971 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-httpd" containerID="cri-o://423863723ace49a11675767356ead1a32dc7658d7769ea7c64a96cba5343a3ca" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.328815 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1306-account-create-update-9p79b"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.355769 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.356649 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword 
variable."} Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: if [ -n "" ]; then Jan 30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="" Jan 30 05:29:44 crc kubenswrapper[4841]: else Jan 30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4841]: fi Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4841]: # support updates Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.358376 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-7z8bf" podUID="1afed894-4dfb-4873-a45c-29b70507295a" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.419272 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.419882 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-combined-ca-bundle\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.420033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-scripts\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.420135 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc5mt\" (UniqueName: \"kubernetes.io/projected/5df66af1-0c57-44f7-8b54-bc351a3faa66-kube-api-access-kc5mt\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.420341 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-metrics-certs-tls-certs\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.420463 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdb-rundir\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.420547 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdbserver-nb-tls-certs\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc 
kubenswrapper[4841]: I0130 05:29:44.420636 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-config\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.421617 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-config" (OuterVolumeSpecName: "config") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.421895 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.422344 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-server" containerID="cri-o://31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.424541 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-expirer" containerID="cri-o://d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.425068 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="swift-recon-cron" containerID="cri-o://1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.425112 
4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="rsync" containerID="cri-o://ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.425186 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-auditor" containerID="cri-o://9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.425242 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-updater" containerID="cri-o://0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.426233 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-server" containerID="cri-o://a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.426474 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-replicator" containerID="cri-o://cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.429162 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: 
"5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.446612 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-server" containerID="cri-o://a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.446775 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-updater" containerID="cri-o://4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.446824 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-auditor" containerID="cri-o://4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.446868 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-replicator" containerID="cri-o://3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.447813 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-scripts" (OuterVolumeSpecName: "scripts") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.447895 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-auditor" containerID="cri-o://0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.448565 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-reaper" containerID="cri-o://092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.448662 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-replicator" containerID="cri-o://dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.452206 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df66af1-0c57-44f7-8b54-bc351a3faa66-kube-api-access-kc5mt" (OuterVolumeSpecName: "kube-api-access-kc5mt") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "kube-api-access-kc5mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.519012 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.523319 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d823011-229a-426a-99d7-0af611df4000" path="/var/lib/kubelet/pods/0d823011-229a-426a-99d7-0af611df4000/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.526662 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1" path="/var/lib/kubelet/pods/39e2f0d7-9bb0-4621-a416-77d8ba3c4bc1/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.527254 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5871a8-286c-4ae9-90b8-5accaf3e8fa3" path="/var/lib/kubelet/pods/3c5871a8-286c-4ae9-90b8-5accaf3e8fa3/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.527952 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502d6fe3-4215-4b32-8546-a55e5a4afc91" path="/var/lib/kubelet/pods/502d6fe3-4215-4b32-8546-a55e5a4afc91/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.546786 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60131c9c-b83a-472c-ad14-5ea846e9b04d" path="/var/lib/kubelet/pods/60131c9c-b83a-472c-ad14-5ea846e9b04d/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.547829 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65017d10-45fc-4a63-adbf-b09b4adafb4e" path="/var/lib/kubelet/pods/65017d10-45fc-4a63-adbf-b09b4adafb4e/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.548535 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"5df66af1-0c57-44f7-8b54-bc351a3faa66\" (UID: \"5df66af1-0c57-44f7-8b54-bc351a3faa66\") " Jan 30 05:29:44 crc kubenswrapper[4841]: W0130 05:29:44.548676 4841 mount_helper_common.go:34] Warning: 
mount cleanup skipped because path does not exist: /var/lib/kubelet/pods/5df66af1-0c57-44f7-8b54-bc351a3faa66/volumes/kubernetes.io~local-volume/local-storage08-crc Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.548732 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.549189 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654366d0-8b60-4ae2-bde2-981cdc9464a4" path="/var/lib/kubelet/pods/654366d0-8b60-4ae2-bde2-981cdc9464a4/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.549837 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d4a4be-d1db-4e1e-81de-09de7396cb0a" path="/var/lib/kubelet/pods/66d4a4be-d1db-4e1e-81de-09de7396cb0a/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.550294 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.550324 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.550333 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5df66af1-0c57-44f7-8b54-bc351a3faa66-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.550341 4841 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kc5mt\" (UniqueName: \"kubernetes.io/projected/5df66af1-0c57-44f7-8b54-bc351a3faa66-kube-api-access-kc5mt\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.550351 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.550861 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f5077d-7967-4cbe-9254-09728b25ab58" path="/var/lib/kubelet/pods/68f5077d-7967-4cbe-9254-09728b25ab58/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.551205 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.551253 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data podName:cc423120-ba93-465b-8ef8-871904b901ef nodeName:}" failed. No retries permitted until 2026-01-30 05:29:46.551239835 +0000 UTC m=+1323.544712473 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data") pod "rabbitmq-server-0" (UID: "cc423120-ba93-465b-8ef8-871904b901ef") : configmap "rabbitmq-config-data" not found Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.552901 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738fde20-8e94-46e9-bb59-24f917e279cd" path="/var/lib/kubelet/pods/738fde20-8e94-46e9-bb59-24f917e279cd/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.553520 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a361348-e06e-4aa4-b180-0450782b1dfc" path="/var/lib/kubelet/pods/8a361348-e06e-4aa4-b180-0450782b1dfc/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.554095 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a606323e-b83b-4046-aeff-ea4ded617943" path="/var/lib/kubelet/pods/a606323e-b83b-4046-aeff-ea4ded617943/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.555513 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.560929 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1" path="/var/lib/kubelet/pods/ac5bcf96-ceff-47c0-9dbb-4a96c0e7c8b1/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.563103 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beaf0495-d1d3-42df-b727-dc0c6fb5fe2a" path="/var/lib/kubelet/pods/beaf0495-d1d3-42df-b727-dc0c6fb5fe2a/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.563717 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef5639f-2a56-4255-ae24-a8f794a7b715" path="/var/lib/kubelet/pods/bef5639f-2a56-4255-ae24-a8f794a7b715/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.569241 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff309626-60f6-4110-8b20-5354dab1ca68" path="/var/lib/kubelet/pods/ff309626-60f6-4110-8b20-5354dab1ca68/volumes" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.628949 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.654855 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.654886 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.670782 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.719765 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.729191 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "5df66af1-0c57-44f7-8b54-bc351a3faa66" (UID: "5df66af1-0c57-44f7-8b54-bc351a3faa66"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.730796 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: if [ -n "glance" ]; then Jan 30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="glance" Jan 30 05:29:44 crc kubenswrapper[4841]: else Jan 
30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4841]: fi Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4841]: # support updates Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.731895 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-9a61-account-create-update-8jgdj" podUID="4e1bee18-1dd7-42be-b113-6a746b3ff70d" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.736295 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd" probeResult="failure" output=< Jan 30 05:29:44 crc kubenswrapper[4841]: cat: /var/run/openvswitch/ovs-vswitchd.pid: No such file or directory Jan 30 05:29:44 crc kubenswrapper[4841]: ERROR - Failed to get pid for ovs-vswitchd, exit status: 0 Jan 30 05:29:44 crc kubenswrapper[4841]: > Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.749609 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd" containerID="cri-o://6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" gracePeriod=29 Jan 30 05:29:44 crc 
kubenswrapper[4841]: I0130 05:29:44.757468 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.757497 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5df66af1-0c57-44f7-8b54-bc351a3faa66-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.770786 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9e49c58-8075-46a1-8bfd-44412a673589" containerID="31ef83de4f5eec8256ad9a9714b034f94d5c23abce8bac25f0acaf273fe574d8" exitCode=143 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793205 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7z8bf"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9e49c58-8075-46a1-8bfd-44412a673589","Type":"ContainerDied","Data":"31ef83de4f5eec8256ad9a9714b034f94d5c23abce8bac25f0acaf273fe574d8"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793264 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793277 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hrbft"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793288 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9a61-account-create-update-8jgdj" event={"ID":"4e1bee18-1dd7-42be-b113-6a746b3ff70d","Type":"ContainerStarted","Data":"f5e7d86049bcc5e08b656357960f8a4470152ba9d86730ed4073bdb2e47f5a8a"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 
05:29:44.793305 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hrbft"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793320 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2cee-account-create-update-qbxlv"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793330 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2cee-account-create-update-qbxlv"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793342 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-w85md"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793352 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-w85md"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793361 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9a61-account-create-update-8jgdj"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.793373 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.798908 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-26gs6"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.799012 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-log" containerID="cri-o://f9bffcb1c98bf20cc71c95a2813cb2b9fa85cff321938e56af140ba1916af595" gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.799217 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-httpd" containerID="cri-o://249a278941e10cc86f1a6bbc7212e043a936d0a3e53d7ee03f0ed2a158d8459b" 
gracePeriod=30 Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.806141 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: if [ -n "glance" ]; then Jan 30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="glance" Jan 30 05:29:44 crc kubenswrapper[4841]: else Jan 30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4841]: fi Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4841]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4841]: # support updates Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.807491 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-9a61-account-create-update-8jgdj" podUID="4e1bee18-1dd7-42be-b113-6a746b3ff70d" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.808699 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.818283 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:44 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: if [ -n "barbican" ]; then Jan 30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="barbican" Jan 30 05:29:44 crc kubenswrapper[4841]: else Jan 30 05:29:44 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:44 crc kubenswrapper[4841]: fi Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc 
kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:44 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:44 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:44 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:44 crc kubenswrapper[4841]: # support updates Jan 30 05:29:44 crc kubenswrapper[4841]: Jan 30 05:29:44 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.821377 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-676d-account-create-update-tkbqn" podUID="ff1ce10f-fa3e-4526-a823-f4defdaf9085" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.821851 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d99c-account-create-update-xhj29"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.829234 4841 generic.go:334] "Generic (PLEG): container finished" podID="f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" containerID="3c7852e63582709a9e74b827146a470d704988cb06dbe6df77ac6ac4fc666c94" exitCode=137 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.862698 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-26gs6"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.862930 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cd33f000-ac38-400f-95b4-d9f6a68d13c0/ovsdbserver-sb/0.log" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.863001 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.866745 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5df66af1-0c57-44f7-8b54-bc351a3faa66/ovsdbserver-nb/0.log" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.866888 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5df66af1-0c57-44f7-8b54-bc351a3faa66","Type":"ContainerDied","Data":"cc0f049e0885820ab6e679e1bba1a27774c34c43371b4e35e8197ce8b7b0491e"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.866971 4841 scope.go:117] "RemoveContainer" containerID="770c37b5591c01b91dbf8c41b92ea066c0501a7dc4909ac010c8ff458d9d3823" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.867138 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.867383 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n25tn_30c47888-1780-4539-8777-5914009b862f/openstack-network-exporter/0.log" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.867463 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-n25tn" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.867589 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.902220 4841 generic.go:334] "Generic (PLEG): container finished" podID="1a2724da-6b9b-4947-a4e3-894938742304" containerID="111f861be82792295977d9e1509a0d34e506936e498cf079b08adb56c250f14b" exitCode=143 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.902299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a2724da-6b9b-4947-a4e3-894938742304","Type":"ContainerDied","Data":"111f861be82792295977d9e1509a0d34e506936e498cf079b08adb56c250f14b"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.903820 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-676d-account-create-update-tkbqn"] Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.903886 4841 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 30 05:29:44 crc kubenswrapper[4841]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 30 05:29:44 crc kubenswrapper[4841]: + source /usr/local/bin/container-scripts/functions Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNBridge=br-int Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNRemote=tcp:localhost:6642 Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNEncapType=geneve Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNAvailabilityZones= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ EnableChassisAsGateway=true Jan 30 05:29:44 crc kubenswrapper[4841]: ++ PhysicalNetworks= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNHostName= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 30 05:29:44 crc kubenswrapper[4841]: ++ ovs_dir=/var/lib/openvswitch Jan 30 05:29:44 crc kubenswrapper[4841]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 30 05:29:44 crc kubenswrapper[4841]: ++ 
FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 30 05:29:44 crc kubenswrapper[4841]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + sleep 0.5 Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + sleep 0.5 Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + cleanup_ovsdb_server_semaphore Jan 30 05:29:44 crc kubenswrapper[4841]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 30 05:29:44 crc kubenswrapper[4841]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 30 05:29:44 crc kubenswrapper[4841]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-lbv2q" message=< Jan 30 05:29:44 crc kubenswrapper[4841]: Exiting ovsdb-server (5) [ OK ] Jan 30 05:29:44 crc kubenswrapper[4841]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 30 05:29:44 crc kubenswrapper[4841]: + source /usr/local/bin/container-scripts/functions Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNBridge=br-int Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNRemote=tcp:localhost:6642 Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNEncapType=geneve Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNAvailabilityZones= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ EnableChassisAsGateway=true Jan 30 05:29:44 crc kubenswrapper[4841]: ++ PhysicalNetworks= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNHostName= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 30 05:29:44 crc kubenswrapper[4841]: ++ ovs_dir=/var/lib/openvswitch Jan 30 05:29:44 crc kubenswrapper[4841]: 
++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 30 05:29:44 crc kubenswrapper[4841]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 30 05:29:44 crc kubenswrapper[4841]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + sleep 0.5 Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + sleep 0.5 Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + cleanup_ovsdb_server_semaphore Jan 30 05:29:44 crc kubenswrapper[4841]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 30 05:29:44 crc kubenswrapper[4841]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 30 05:29:44 crc kubenswrapper[4841]: > Jan 30 05:29:44 crc kubenswrapper[4841]: E0130 05:29:44.903927 4841 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 30 05:29:44 crc kubenswrapper[4841]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 30 05:29:44 crc kubenswrapper[4841]: + source /usr/local/bin/container-scripts/functions Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNBridge=br-int Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNRemote=tcp:localhost:6642 Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNEncapType=geneve Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNAvailabilityZones= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ EnableChassisAsGateway=true Jan 30 05:29:44 crc kubenswrapper[4841]: ++ PhysicalNetworks= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ OVNHostName= Jan 30 05:29:44 crc kubenswrapper[4841]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 30 05:29:44 crc 
kubenswrapper[4841]: ++ ovs_dir=/var/lib/openvswitch Jan 30 05:29:44 crc kubenswrapper[4841]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 30 05:29:44 crc kubenswrapper[4841]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 30 05:29:44 crc kubenswrapper[4841]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + sleep 0.5 Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + sleep 0.5 Jan 30 05:29:44 crc kubenswrapper[4841]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 30 05:29:44 crc kubenswrapper[4841]: + cleanup_ovsdb_server_semaphore Jan 30 05:29:44 crc kubenswrapper[4841]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 30 05:29:44 crc kubenswrapper[4841]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 30 05:29:44 crc kubenswrapper[4841]: > pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server" containerID="cri-o://0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.903964 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server" containerID="cri-o://0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" gracePeriod=29 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.916504 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-j4ls7"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.920069 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="28551500-d017-475a-aae4-8352782c0b4e" containerID="088a47d541f00148328e19a9fb5636697d24c805841cd93d6a4ac4b7d6e6779f" exitCode=143 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.920134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc8b6ddd6-nkc6r" event={"ID":"28551500-d017-475a-aae4-8352782c0b4e","Type":"ContainerDied","Data":"088a47d541f00148328e19a9fb5636697d24c805841cd93d6a4ac4b7d6e6779f"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.931528 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-j4ls7"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.932204 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n25tn_30c47888-1780-4539-8777-5914009b862f/openstack-network-exporter/0.log" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.932239 4841 generic.go:334] "Generic (PLEG): container finished" podID="30c47888-1780-4539-8777-5914009b862f" containerID="97894b49cfe91943a3d82f73c54427c16ccf3ca0fab823dd8f5f0d111192e569" exitCode=2 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.932309 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n25tn" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.932730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n25tn" event={"ID":"30c47888-1780-4539-8777-5914009b862f","Type":"ContainerDied","Data":"97894b49cfe91943a3d82f73c54427c16ccf3ca0fab823dd8f5f0d111192e569"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.938995 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wsbdm"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.950822 4841 scope.go:117] "RemoveContainer" containerID="8b17702bd7db482eb4d14d6e36c3a54793a366a5aa87ef51064106768aa101bc" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.956827 4841 generic.go:334] "Generic (PLEG): container finished" podID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerID="71a6a5266aac4658f33e51c0327009b65b98cc2a8a908dc821c307eb9aa11b89" exitCode=0 Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.956946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65977b5879-qctf6" event={"ID":"8ad9e30b-abf9-45fd-9088-103c94e4ed70","Type":"ContainerDied","Data":"71a6a5266aac4658f33e51c0327009b65b98cc2a8a908dc821c307eb9aa11b89"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.965065 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wsbdm"] Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.973925 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-config\") pod \"0da7e312-7550-4d60-a14b-d5dbdc500e88\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.973965 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8px8v\" (UniqueName: 
\"kubernetes.io/projected/0da7e312-7550-4d60-a14b-d5dbdc500e88-kube-api-access-8px8v\") pod \"0da7e312-7550-4d60-a14b-d5dbdc500e88\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.973994 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf5gr\" (UniqueName: \"kubernetes.io/projected/30c47888-1780-4539-8777-5914009b862f-kube-api-access-vf5gr\") pod \"30c47888-1780-4539-8777-5914009b862f\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974021 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-scripts\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974083 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-metrics-certs-tls-certs\") pod \"30c47888-1780-4539-8777-5914009b862f\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974108 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-config\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974146 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-metrics-certs-tls-certs\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:44 crc 
kubenswrapper[4841]: I0130 05:29:44.974188 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c47888-1780-4539-8777-5914009b862f-config\") pod \"30c47888-1780-4539-8777-5914009b862f\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974205 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-sb\") pod \"0da7e312-7550-4d60-a14b-d5dbdc500e88\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974267 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovn-rundir\") pod \"30c47888-1780-4539-8777-5914009b862f\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974297 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdb-rundir\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-svc\") pod \"0da7e312-7550-4d60-a14b-d5dbdc500e88\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974352 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-nb\") pod 
\"0da7e312-7550-4d60-a14b-d5dbdc500e88\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974390 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2rb\" (UniqueName: \"kubernetes.io/projected/cd33f000-ac38-400f-95b4-d9f6a68d13c0-kube-api-access-tl2rb\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974432 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974484 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdbserver-sb-tls-certs\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.974510 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-combined-ca-bundle\") pod \"30c47888-1780-4539-8777-5914009b862f\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.979795 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-config" (OuterVolumeSpecName: "config") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.984758 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-scripts" (OuterVolumeSpecName: "scripts") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.984904 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "30c47888-1780-4539-8777-5914009b862f" (UID: "30c47888-1780-4539-8777-5914009b862f"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.985958 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.985902 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c47888-1780-4539-8777-5914009b862f-kube-api-access-vf5gr" (OuterVolumeSpecName: "kube-api-access-vf5gr") pod "30c47888-1780-4539-8777-5914009b862f" (UID: "30c47888-1780-4539-8777-5914009b862f"). InnerVolumeSpecName "kube-api-access-vf5gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.993104 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cd33f000-ac38-400f-95b4-d9f6a68d13c0/ovsdbserver-sb/0.log" Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.993349 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd33f000-ac38-400f-95b4-d9f6a68d13c0","Type":"ContainerDied","Data":"0758459fb9e4f3940596789815cd623a5bc6ab5827144d6d8e84cf5952e31bcd"} Jan 30 05:29:44 crc kubenswrapper[4841]: I0130 05:29:44.993531 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.000222 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7z8bf" event={"ID":"1afed894-4dfb-4873-a45c-29b70507295a","Type":"ContainerStarted","Data":"68a54c2161fbb02f8f5192095e93d59f373e572d8122da43c30c351668ab291a"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.001447 4841 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-7z8bf" secret="" err="secret \"galera-openstack-cell1-dockercfg-wbm6q\" not found" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.002133 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.002344 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-log" containerID="cri-o://2603ac57002b34136c88ddf281a8b6cb56fddccd4b183b0ab1effc47d15e9154" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.002477 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-metadata" containerID="cri-o://d334449b0de1d005d626d94fd883ec0192a4936b40dc9ec24b425dde3a584637" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.004966 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.005136 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30c47888-1780-4539-8777-5914009b862f-config" (OuterVolumeSpecName: "config") pod "30c47888-1780-4539-8777-5914009b862f" (UID: "30c47888-1780-4539-8777-5914009b862f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.006474 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd33f000-ac38-400f-95b4-d9f6a68d13c0-kube-api-access-tl2rb" (OuterVolumeSpecName: "kube-api-access-tl2rb") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "kube-api-access-tl2rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.014681 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da7e312-7550-4d60-a14b-d5dbdc500e88-kube-api-access-8px8v" (OuterVolumeSpecName: "kube-api-access-8px8v") pod "0da7e312-7550-4d60-a14b-d5dbdc500e88" (UID: "0da7e312-7550-4d60-a14b-d5dbdc500e88"). InnerVolumeSpecName "kube-api-access-8px8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.021635 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.022355 4841 scope.go:117] "RemoveContainer" containerID="97894b49cfe91943a3d82f73c54427c16ccf3ca0fab823dd8f5f0d111192e569" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.026547 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" event={"ID":"0da7e312-7550-4d60-a14b-d5dbdc500e88","Type":"ContainerDied","Data":"4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.026527 4841 generic.go:334] "Generic (PLEG): container finished" podID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerID="4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.026647 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.026725 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-zb2qt" event={"ID":"0da7e312-7550-4d60-a14b-d5dbdc500e88","Type":"ContainerDied","Data":"9381d5c45ba94de36ed625ec31606b1cf1ad32e73382b50c5f7e64e061b5c058"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.058057 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-46xxt"] Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.066109 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:45 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: if [ -n "" ]; then Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="" Jan 30 05:29:45 crc kubenswrapper[4841]: else Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:45 crc kubenswrapper[4841]: fi Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:45 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:45 crc kubenswrapper[4841]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:45 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:45 crc kubenswrapper[4841]: # support updates Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.067245 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-7z8bf" podUID="1afed894-4dfb-4873-a45c-29b70507295a" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.072851 4841 scope.go:117] "RemoveContainer" containerID="d9d5bddafde38d3fd047f16407b7b2377643bde704520227bda10004fa73dc8d" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.079974 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovs-rundir\") pod \"30c47888-1780-4539-8777-5914009b862f\" (UID: \"30c47888-1780-4539-8777-5914009b862f\") " Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.080016 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-combined-ca-bundle\") pod \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\" (UID: \"cd33f000-ac38-400f-95b4-d9f6a68d13c0\") " Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.080041 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-swift-storage-0\") pod \"0da7e312-7550-4d60-a14b-d5dbdc500e88\" (UID: \"0da7e312-7550-4d60-a14b-d5dbdc500e88\") " Jan 30 05:29:45 
crc kubenswrapper[4841]: I0130 05:29:45.080393 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "30c47888-1780-4539-8777-5914009b862f" (UID: "30c47888-1780-4539-8777-5914009b862f"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081367 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081389 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd33f000-ac38-400f-95b4-d9f6a68d13c0-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081411 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c47888-1780-4539-8777-5914009b862f-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081419 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081427 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081437 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2rb\" (UniqueName: \"kubernetes.io/projected/cd33f000-ac38-400f-95b4-d9f6a68d13c0-kube-api-access-tl2rb\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc 
kubenswrapper[4841]: I0130 05:29:45.081453 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081463 4841 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/30c47888-1780-4539-8777-5914009b862f-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081472 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8px8v\" (UniqueName: \"kubernetes.io/projected/0da7e312-7550-4d60-a14b-d5dbdc500e88-kube-api-access-8px8v\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.081482 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf5gr\" (UniqueName: \"kubernetes.io/projected/30c47888-1780-4539-8777-5914009b862f-kube-api-access-vf5gr\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.084963 4841 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.085035 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts podName:1afed894-4dfb-4873-a45c-29b70507295a nodeName:}" failed. No retries permitted until 2026-01-30 05:29:45.585015207 +0000 UTC m=+1322.578487845 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts") pod "root-account-create-update-7z8bf" (UID: "1afed894-4dfb-4873-a45c-29b70507295a") : configmap "openstack-cell1-scripts" not found Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.096519 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c47888-1780-4539-8777-5914009b862f" (UID: "30c47888-1780-4539-8777-5914009b862f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.106556 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.106771 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-log" containerID="cri-o://153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.107046 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-api" containerID="cri-o://e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.116744 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-brvmt"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.134898 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="365caacf-756c-4558-b281-f8644c9c1c5f" containerName="galera" 
containerID="cri-o://42b9436e6ff50cbea7adadc24b6d483db87ac5a4389559de150639c961eb2f31" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.136388 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-brvmt"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137019 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137038 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137045 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137051 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137057 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137063 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137069 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" 
containerID="3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137077 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137082 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137088 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14" exitCode=0 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137123 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137168 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137182 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137201 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137209 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137221 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.137245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14"} Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 
05:29:45.141566 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.142081 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5fb964589-phnmn"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.142298 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5fb964589-phnmn" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-httpd" containerID="cri-o://fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.142781 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5fb964589-phnmn" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-server" containerID="cri-o://8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.142775 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0da7e312-7550-4d60-a14b-d5dbdc500e88" (UID: "0da7e312-7550-4d60-a14b-d5dbdc500e88"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.148896 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d94d6f7cb-lf9nq"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.149199 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d94d6f7cb-lf9nq" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api-log" containerID="cri-o://0de4c7d3f6fcb35d2a2e2a038bee89e95869ad06f6b00924683feb595867a396" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.149435 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d94d6f7cb-lf9nq" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api" containerID="cri-o://bc897d8ef0f32c66c2606560ae71bcd74a208effcd7e7b2a87ba2f0b34843405" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.151615 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0da7e312-7550-4d60-a14b-d5dbdc500e88" (UID: "0da7e312-7550-4d60-a14b-d5dbdc500e88"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.156223 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-69fd44f6fc-dpzfd"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.156417 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker-log" containerID="cri-o://da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.156703 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker" containerID="cri-o://ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.170359 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7b9cf755cd-5p4pk"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.170636 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener-log" containerID="cri-o://0a7c1a689c4e10c27ddb7c3c12fcc8d9ee61127c2c5b8d9fdc13d03072e0a7c1" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.170744 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener" containerID="cri-o://5c4314bd5d4e8c9a32d1faa3556ab7cd3dae6013e3228dbb71483e6295221f1f" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.175180 4841 
scope.go:117] "RemoveContainer" containerID="eaca801d46ea5dc0cda81a019aed95bd7823bddb489326243d7b1a3fbaa6599a" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.175832 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:45 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: if [ -n "neutron" ]; then Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="neutron" Jan 30 05:29:45 crc kubenswrapper[4841]: else Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:45 crc kubenswrapper[4841]: fi Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:45 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:45 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:45 crc kubenswrapper[4841]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:45 crc kubenswrapper[4841]: # support updates Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.176371 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0da7e312-7550-4d60-a14b-d5dbdc500e88" (UID: "0da7e312-7550-4d60-a14b-d5dbdc500e88"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.177636 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-1306-account-create-update-9p79b" podUID="3106f00a-8ed8-4189-be2e-f5c6cce1b4ca" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.184693 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.205733 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.205984 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.205994 4841 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.206007 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.215519 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7z8bf"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.232940 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.233128 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6fe10a9a-a21e-4b2c-a5da-d50340115c7a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://de54af2206ef9739b9851bdbdbfe8715a9ca62af5991471c3c6388dc3e2c68b3" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.242891 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9a61-account-create-update-8jgdj"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.249054 4841 scope.go:117] "RemoveContainer" containerID="4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.251532 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-676d-account-create-update-tkbqn"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.269027 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.269198 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="86140170-ca48-47e9-b587-43f98f3624c1" 
containerName="nova-scheduler-scheduler" containerID="cri-o://478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.286996 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.298338 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fjjz6"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.306743 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhlr\" (UniqueName: \"kubernetes.io/projected/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-kube-api-access-lzhlr\") pod \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.307360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config-secret\") pod \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.307517 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config\") pod \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.307686 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-combined-ca-bundle\") pod \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\" (UID: \"f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5\") " Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 
05:29:45.342026 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.342210 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="60a29cc8-4615-40e7-a687-1852db124ba0" containerName="nova-cell1-conductor-conductor" containerID="cri-o://3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.342704 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fjjz6"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.353480 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q4xj8"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.363652 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-kube-api-access-lzhlr" (OuterVolumeSpecName: "kube-api-access-lzhlr") pod "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" (UID: "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5"). InnerVolumeSpecName "kube-api-access-lzhlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.363758 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.367752 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.367946 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="9baa24e8-552c-425b-a494-ca70b9bcff0c" containerName="nova-cell0-conductor-conductor" containerID="cri-o://93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1" gracePeriod=30 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.382338 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q4xj8"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.410182 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.410319 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzhlr\" (UniqueName: \"kubernetes.io/projected/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-kube-api-access-lzhlr\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.426081 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.427256 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.427841 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-config" (OuterVolumeSpecName: "config") pod "0da7e312-7550-4d60-a14b-d5dbdc500e88" (UID: "0da7e312-7550-4d60-a14b-d5dbdc500e88"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.428345 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.428413 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="86140170-ca48-47e9-b587-43f98f3624c1" containerName="nova-scheduler-scheduler" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.448606 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" (UID: "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.458916 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cc423120-ba93-465b-8ef8-871904b901ef" containerName="rabbitmq" containerID="cri-o://a19190f1ad60a61a31c984cadceaa8ab89c01149a3adc2e54a53efba5d740bd4" gracePeriod=604800 Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.494059 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "30c47888-1780-4539-8777-5914009b862f" (UID: "30c47888-1780-4539-8777-5914009b862f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.497168 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0da7e312-7550-4d60-a14b-d5dbdc500e88" (UID: "0da7e312-7550-4d60-a14b-d5dbdc500e88"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.511551 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.511721 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.511775 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0da7e312-7550-4d60-a14b-d5dbdc500e88-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.511844 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c47888-1780-4539-8777-5914009b862f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.512388 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" (UID: "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.517149 4841 scope.go:117] "RemoveContainer" containerID="a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.539927 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" (UID: "f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.548843 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.549023 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.549444 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.550282 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.557176 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:45 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: if [ -n "nova_cell0" ]; then Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="nova_cell0" Jan 30 05:29:45 crc kubenswrapper[4841]: else Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:45 crc kubenswrapper[4841]: fi Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:45 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:45 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:45 crc kubenswrapper[4841]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:45 crc kubenswrapper[4841]: # support updates Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.557180 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:45 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: if [ -n "nova_api" ]; then Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="nova_api" Jan 30 05:29:45 crc kubenswrapper[4841]: else Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:45 crc kubenswrapper[4841]: fi Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:45 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:45 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:45 crc kubenswrapper[4841]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:45 crc kubenswrapper[4841]: # support updates Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.557262 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:45 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: if [ -n "cinder" ]; then Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="cinder" Jan 30 05:29:45 crc kubenswrapper[4841]: else Jan 30 05:29:45 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:45 crc kubenswrapper[4841]: fi Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:45 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:45 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:45 crc kubenswrapper[4841]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:45 crc kubenswrapper[4841]: # support updates Jan 30 05:29:45 crc kubenswrapper[4841]: Jan 30 05:29:45 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.558443 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-d99c-account-create-update-xhj29" podUID="69db2b46-7a27-45b3-9bca-fac2189f47ef" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.558382 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-de72-account-create-update-ckzp8" podUID="f4335ead-4d73-4061-b069-960881c2f2d9" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.558419 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-1fec-account-create-update-46xxt" podUID="703d8335-345d-4847-afca-b5667e0b4c0f" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.614421 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.614453 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.614463 
4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.614521 4841 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.614566 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts podName:1afed894-4dfb-4873-a45c-29b70507295a nodeName:}" failed. No retries permitted until 2026-01-30 05:29:46.614551966 +0000 UTC m=+1323.608024604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts") pod "root-account-create-update-7z8bf" (UID: "1afed894-4dfb-4873-a45c-29b70507295a") : configmap "openstack-cell1-scripts" not found Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.614859 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.623214 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cd33f000-ac38-400f-95b4-d9f6a68d13c0" (UID: "cd33f000-ac38-400f-95b4-d9f6a68d13c0"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.635455 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.648930 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-de72-account-create-update-ckzp8"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.657046 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d99c-account-create-update-xhj29"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.664369 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-46xxt"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.690376 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-n25tn"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.694547 4841 scope.go:117] "RemoveContainer" containerID="4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.695069 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2\": container with ID starting with 4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2 not found: ID does not exist" containerID="4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.695093 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2"} err="failed to get container status \"4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2\": rpc error: code = NotFound desc = could not find container 
\"4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2\": container with ID starting with 4f831fa3e84a699ca470ce051853ca952c34e9134f0d098a69933bc17e9709f2 not found: ID does not exist" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.695114 4841 scope.go:117] "RemoveContainer" containerID="a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36" Jan 30 05:29:45 crc kubenswrapper[4841]: E0130 05:29:45.695359 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36\": container with ID starting with a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36 not found: ID does not exist" containerID="a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.695384 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36"} err="failed to get container status \"a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36\": rpc error: code = NotFound desc = could not find container \"a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36\": container with ID starting with a3396857474a1366b395f609227fb5083231e58e937199f3945762778f000d36 not found: ID does not exist" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.696085 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-n25tn"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.716609 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd33f000-ac38-400f-95b4-d9f6a68d13c0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.735094 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-fcd6f8f8f-zb2qt"] Jan 30 05:29:45 crc kubenswrapper[4841]: I0130 05:29:45.744392 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-zb2qt"] Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.015990 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.041802 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.048781 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.132332 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-internal-tls-certs\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.132959 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-combined-ca-bundle\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.133002 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-public-tls-certs\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.133020 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cglr\" (UniqueName: 
\"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-kube-api-access-8cglr\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.133050 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-config-data\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.133158 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-run-httpd\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.133247 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-log-httpd\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.133280 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-etc-swift\") pod \"f198eff9-f493-43d9-9b64-06196b205142\" (UID: \"f198eff9-f493-43d9-9b64-06196b205142\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.134507 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.134652 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.138113 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-kube-api-access-8cglr" (OuterVolumeSpecName: "kube-api-access-8cglr") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "kube-api-access-8cglr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.146267 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.154775 4841 generic.go:334] "Generic (PLEG): container finished" podID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerID="2603ac57002b34136c88ddf281a8b6cb56fddccd4b183b0ab1effc47d15e9154" exitCode=143 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.154855 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7be8df86-7b8d-4741-ae13-ec1b243549b3","Type":"ContainerDied","Data":"2603ac57002b34136c88ddf281a8b6cb56fddccd4b183b0ab1effc47d15e9154"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.157284 4841 generic.go:334] "Generic (PLEG): container finished" podID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerID="f9bffcb1c98bf20cc71c95a2813cb2b9fa85cff321938e56af140ba1916af595" exitCode=143 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.157381 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"73fdf532-7bb7-43db-acbc-b949166ccd6b","Type":"ContainerDied","Data":"f9bffcb1c98bf20cc71c95a2813cb2b9fa85cff321938e56af140ba1916af595"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.160170 4841 generic.go:334] "Generic (PLEG): container finished" podID="365caacf-756c-4558-b281-f8644c9c1c5f" containerID="42b9436e6ff50cbea7adadc24b6d483db87ac5a4389559de150639c961eb2f31" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.160219 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"365caacf-756c-4558-b281-f8644c9c1c5f","Type":"ContainerDied","Data":"42b9436e6ff50cbea7adadc24b6d483db87ac5a4389559de150639c961eb2f31"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.164654 4841 generic.go:334] "Generic (PLEG): container finished" podID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerID="0a7c1a689c4e10c27ddb7c3c12fcc8d9ee61127c2c5b8d9fdc13d03072e0a7c1" 
exitCode=143 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.164735 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" event={"ID":"3b29e384-bd86-4102-8a9e-4745cd0ae8d5","Type":"ContainerDied","Data":"0a7c1a689c4e10c27ddb7c3c12fcc8d9ee61127c2c5b8d9fdc13d03072e0a7c1"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.171770 4841 generic.go:334] "Generic (PLEG): container finished" podID="91db9edf-7d6d-4189-aaac-480a438900be" containerID="0de4c7d3f6fcb35d2a2e2a038bee89e95869ad06f6b00924683feb595867a396" exitCode=143 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.171882 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d94d6f7cb-lf9nq" event={"ID":"91db9edf-7d6d-4189-aaac-480a438900be","Type":"ContainerDied","Data":"0de4c7d3f6fcb35d2a2e2a038bee89e95869ad06f6b00924683feb595867a396"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.174335 4841 generic.go:334] "Generic (PLEG): container finished" podID="f198eff9-f493-43d9-9b64-06196b205142" containerID="8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.174387 4841 generic.go:334] "Generic (PLEG): container finished" podID="f198eff9-f493-43d9-9b64-06196b205142" containerID="fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.174465 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5fb964589-phnmn" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.174842 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb964589-phnmn" event={"ID":"f198eff9-f493-43d9-9b64-06196b205142","Type":"ContainerDied","Data":"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.175289 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb964589-phnmn" event={"ID":"f198eff9-f493-43d9-9b64-06196b205142","Type":"ContainerDied","Data":"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.175374 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb964589-phnmn" event={"ID":"f198eff9-f493-43d9-9b64-06196b205142","Type":"ContainerDied","Data":"fe33d56b31bdb343fca3d156e40bf9674067615261c3d2c6b427d4633546e434"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.175488 4841 scope.go:117] "RemoveContainer" containerID="8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.192386 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.192668 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.194963 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 
05:29:46.195099 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.195237 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.195339 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.195952 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.196310 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.210489 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.218933 4841 generic.go:334] "Generic (PLEG): container finished" podID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerID="0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.219157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc","Type":"ContainerDied","Data":"0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.223734 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-676d-account-create-update-tkbqn" event={"ID":"ff1ce10f-fa3e-4526-a823-f4defdaf9085","Type":"ContainerStarted","Data":"43ed044e1286fb9780ca41391af739022fad6b12ebd947d30dabd327cf555fca"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.228426 4841 generic.go:334] "Generic (PLEG): container finished" podID="6fe10a9a-a21e-4b2c-a5da-d50340115c7a" containerID="de54af2206ef9739b9851bdbdbfe8715a9ca62af5991471c3c6388dc3e2c68b3" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.228454 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe10a9a-a21e-4b2c-a5da-d50340115c7a","Type":"ContainerDied","Data":"de54af2206ef9739b9851bdbdbfe8715a9ca62af5991471c3c6388dc3e2c68b3"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.229231 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.232219 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fec-account-create-update-46xxt" event={"ID":"703d8335-345d-4847-afca-b5667e0b4c0f","Type":"ContainerStarted","Data":"cbc0bf8fa64e7fb6ec3c68ab7842c5c6aa8b5911b9beca6f3b472cc0c18c6f1b"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.232246 4841 scope.go:117] "RemoveContainer" containerID="fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.233719 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:46 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: if [ -n "barbican" ]; then Jan 30 05:29:46 crc kubenswrapper[4841]: GRANT_DATABASE="barbican" Jan 30 05:29:46 crc kubenswrapper[4841]: else Jan 30 05:29:46 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:46 crc kubenswrapper[4841]: fi Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:46 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:46 crc kubenswrapper[4841]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:46 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:46 crc kubenswrapper[4841]: # support updates Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.235428 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-676d-account-create-update-tkbqn" podUID="ff1ce10f-fa3e-4526-a823-f4defdaf9085" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.237040 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.237079 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f198eff9-f493-43d9-9b64-06196b205142-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.237089 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.237097 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.237107 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cglr\" (UniqueName: 
\"kubernetes.io/projected/f198eff9-f493-43d9-9b64-06196b205142-kube-api-access-8cglr\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.237202 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.237473 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data podName:ad7779ad-0912-4695-853f-3ce786c2e9ae nodeName:}" failed. No retries permitted until 2026-01-30 05:29:50.237439889 +0000 UTC m=+1327.230912527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.238554 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.253596 4841 generic.go:334] "Generic (PLEG): container finished" podID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerID="153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47" exitCode=143 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.253660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecdab3bb-c4de-4c49-9988-d9ed592a40a7","Type":"ContainerDied","Data":"153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.256502 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-config-data" (OuterVolumeSpecName: "config-data") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.258462 4841 generic.go:334] "Generic (PLEG): container finished" podID="582a9577-0530-4793-8723-01681bdcfda4" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" exitCode=0 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.259190 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lbv2q" event={"ID":"582a9577-0530-4793-8723-01681bdcfda4","Type":"ContainerDied","Data":"0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.279512 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de72-account-create-update-ckzp8" event={"ID":"f4335ead-4d73-4061-b069-960881c2f2d9","Type":"ContainerStarted","Data":"608bf025bd1045513c66e1133ef707f88b1ba91cc955e54d5935933c4043c5c9"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.284267 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f198eff9-f493-43d9-9b64-06196b205142" (UID: "f198eff9-f493-43d9-9b64-06196b205142"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.310890 4841 scope.go:117] "RemoveContainer" containerID="8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.328684 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d\": container with ID starting with 8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d not found: ID does not exist" containerID="8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.328723 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d"} err="failed to get container status \"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d\": rpc error: code = NotFound desc = could not find container \"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d\": container with ID starting with 8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d not found: ID does not exist" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.328751 4841 scope.go:117] "RemoveContainer" containerID="fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.329138 4841 generic.go:334] "Generic (PLEG): container finished" podID="a035e8ef-e433-4c59-a0fd-09937eb5f226" 
containerID="da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184" exitCode=143 Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.329182 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" event={"ID":"a035e8ef-e433-4c59-a0fd-09937eb5f226","Type":"ContainerDied","Data":"da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184"} Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.329222 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3\": container with ID starting with fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3 not found: ID does not exist" containerID="fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.329236 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3"} err="failed to get container status \"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3\": rpc error: code = NotFound desc = could not find container \"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3\": container with ID starting with fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3 not found: ID does not exist" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.329247 4841 scope.go:117] "RemoveContainer" containerID="8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.329759 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d"} err="failed to get container status \"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d\": rpc error: code = 
NotFound desc = could not find container \"8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d\": container with ID starting with 8c27508e69a6f835f8f64b5392a7be0a4b3a1689c2be5923111bb03b6131bb1d not found: ID does not exist" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.329788 4841 scope.go:117] "RemoveContainer" containerID="fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.330082 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3"} err="failed to get container status \"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3\": rpc error: code = NotFound desc = could not find container \"fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3\": container with ID starting with fddb0dcefb32779ac4e4c549bf62d506f30f00194b7e1365fab19687b64844d3 not found: ID does not exist" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.330097 4841 scope.go:117] "RemoveContainer" containerID="3c7852e63582709a9e74b827146a470d704988cb06dbe6df77ac6ac4fc666c94" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.341481 4841 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-7z8bf" secret="" err="secret \"galera-openstack-cell1-dockercfg-wbm6q\" not found" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.342098 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d99c-account-create-update-xhj29" event={"ID":"69db2b46-7a27-45b3-9bca-fac2189f47ef","Type":"ContainerStarted","Data":"4b1739550fc839f0fd61e46baaac5a1cf00c4b6971e4cb1bfd77b08bb0ce7bc2"} Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.343383 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.343415 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.343424 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f198eff9-f493-43d9-9b64-06196b205142-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.359451 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:46 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:46 crc 
kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: if [ -n "" ]; then Jan 30 05:29:46 crc kubenswrapper[4841]: GRANT_DATABASE="" Jan 30 05:29:46 crc kubenswrapper[4841]: else Jan 30 05:29:46 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:46 crc kubenswrapper[4841]: fi Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:46 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:46 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:46 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:46 crc kubenswrapper[4841]: # support updates Jan 30 05:29:46 crc kubenswrapper[4841]: Jan 30 05:29:46 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.360587 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-7z8bf" podUID="1afed894-4dfb-4873-a45c-29b70507295a" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.367931 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-65977b5879-qctf6" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.170:9696/\": dial tcp 10.217.0.170:9696: connect: connection refused" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.448235 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da7e312-7550-4d60-a14b-d5dbdc500e88" 
path="/var/lib/kubelet/pods/0da7e312-7550-4d60-a14b-d5dbdc500e88/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.449029 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1611ffb0-e1aa-487c-aabb-a0f71f4856ff" path="/var/lib/kubelet/pods/1611ffb0-e1aa-487c-aabb-a0f71f4856ff/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.450344 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1faa8a3a-4660-4a7a-81d5-0fd94025b1ad" path="/var/lib/kubelet/pods/1faa8a3a-4660-4a7a-81d5-0fd94025b1ad/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.451354 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c47888-1780-4539-8777-5914009b862f" path="/var/lib/kubelet/pods/30c47888-1780-4539-8777-5914009b862f/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.452083 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5" path="/var/lib/kubelet/pods/34b47d5e-2531-4e8c-b1ea-49ba2f52b7c5/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.453437 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2c58b7-18b1-453c-be45-be86c5008871" path="/var/lib/kubelet/pods/4f2c58b7-18b1-453c-be45-be86c5008871/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.454878 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" path="/var/lib/kubelet/pods/5df66af1-0c57-44f7-8b54-bc351a3faa66/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.455491 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ca3de9-8a40-47c6-8214-2c7cdb5724c9" path="/var/lib/kubelet/pods/80ca3de9-8a40-47c6-8214-2c7cdb5724c9/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.456091 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda8dda1-2f95-43f2-952f-79c7f8adbb63" 
path="/var/lib/kubelet/pods/bda8dda1-2f95-43f2-952f-79c7f8adbb63/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.456751 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbafcabd-b6f1-4c03-839d-4f837803974c" path="/var/lib/kubelet/pods/cbafcabd-b6f1-4c03-839d-4f837803974c/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.457884 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" path="/var/lib/kubelet/pods/cd33f000-ac38-400f-95b4-d9f6a68d13c0/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.458532 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2214644-246d-408c-9315-91b23e85d3f2" path="/var/lib/kubelet/pods/d2214644-246d-408c-9315-91b23e85d3f2/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.459712 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5" path="/var/lib/kubelet/pods/f39f9d1b-33bb-4c4f-b168-3e7b2cdcf7d5/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.460310 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa02d9bc-89ad-4e58-aa4b-62455308f9e7" path="/var/lib/kubelet/pods/fa02d9bc-89ad-4e58-aa4b-62455308f9e7/volumes" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.491090 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.554489 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-vencrypt-tls-certs\") pod \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.554583 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-config-data\") pod \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.554612 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-combined-ca-bundle\") pod \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.554688 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-nova-novncproxy-tls-certs\") pod \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.554737 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh9kf\" (UniqueName: \"kubernetes.io/projected/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-kube-api-access-hh9kf\") pod \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\" (UID: \"6fe10a9a-a21e-4b2c-a5da-d50340115c7a\") " Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.556207 4841 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.556259 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data podName:cc423120-ba93-465b-8ef8-871904b901ef nodeName:}" failed. No retries permitted until 2026-01-30 05:29:50.556244919 +0000 UTC m=+1327.549717557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data") pod "rabbitmq-server-0" (UID: "cc423120-ba93-465b-8ef8-871904b901ef") : configmap "rabbitmq-config-data" not found Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.559690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-kube-api-access-hh9kf" (OuterVolumeSpecName: "kube-api-access-hh9kf") pod "6fe10a9a-a21e-4b2c-a5da-d50340115c7a" (UID: "6fe10a9a-a21e-4b2c-a5da-d50340115c7a"). InnerVolumeSpecName "kube-api-access-hh9kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.584521 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5fb964589-phnmn"] Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.588530 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fe10a9a-a21e-4b2c-a5da-d50340115c7a" (UID: "6fe10a9a-a21e-4b2c-a5da-d50340115c7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.598846 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.604475 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5fb964589-phnmn"] Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.609842 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-config-data" (OuterVolumeSpecName: "config-data") pod "6fe10a9a-a21e-4b2c-a5da-d50340115c7a" (UID: "6fe10a9a-a21e-4b2c-a5da-d50340115c7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.643548 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "6fe10a9a-a21e-4b2c-a5da-d50340115c7a" (UID: "6fe10a9a-a21e-4b2c-a5da-d50340115c7a"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.654555 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "6fe10a9a-a21e-4b2c-a5da-d50340115c7a" (UID: "6fe10a9a-a21e-4b2c-a5da-d50340115c7a"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.656890 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-galera-tls-certs\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.656948 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-default\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657006 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7zqc\" (UniqueName: \"kubernetes.io/projected/365caacf-756c-4558-b281-f8644c9c1c5f-kube-api-access-n7zqc\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657066 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-combined-ca-bundle\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657088 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-operator-scripts\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657154 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-kolla-config\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657234 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-generated\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657273 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"365caacf-756c-4558-b281-f8644c9c1c5f\" (UID: \"365caacf-756c-4558-b281-f8644c9c1c5f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657653 4841 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657670 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh9kf\" (UniqueName: \"kubernetes.io/projected/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-kube-api-access-hh9kf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657678 4841 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657689 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-config-data\") on node \"crc\" DevicePath \"\"" 
Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657698 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe10a9a-a21e-4b2c-a5da-d50340115c7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.657749 4841 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.657787 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts podName:1afed894-4dfb-4873-a45c-29b70507295a nodeName:}" failed. No retries permitted until 2026-01-30 05:29:48.657774785 +0000 UTC m=+1325.651247423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts") pod "root-account-create-update-7z8bf" (UID: "1afed894-4dfb-4873-a45c-29b70507295a") : configmap "openstack-cell1-scripts" not found Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.657843 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.659968 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.660603 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.664519 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.668934 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365caacf-756c-4558-b281-f8644c9c1c5f-kube-api-access-n7zqc" (OuterVolumeSpecName: "kube-api-access-n7zqc") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "kube-api-access-n7zqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.679487 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.688318 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.735628 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "365caacf-756c-4558-b281-f8644c9c1c5f" (UID: "365caacf-756c-4558-b281-f8644c9c1c5f"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759733 4841 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759764 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759787 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759798 4841 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-galera-tls-certs\") on node \"crc\" 
DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759806 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759814 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7zqc\" (UniqueName: \"kubernetes.io/projected/365caacf-756c-4558-b281-f8644c9c1c5f-kube-api-access-n7zqc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759822 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365caacf-756c-4558-b281-f8644c9c1c5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.759830 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/365caacf-756c-4558-b281-f8644c9c1c5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.781415 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.827926 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9a61-account-create-update-8jgdj" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.845109 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.846446 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1306-account-create-update-9p79b" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.855250 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-de72-account-create-update-ckzp8" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.861613 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8z62\" (UniqueName: \"kubernetes.io/projected/4e1bee18-1dd7-42be-b113-6a746b3ff70d-kube-api-access-q8z62\") pod \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.861687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1bee18-1dd7-42be-b113-6a746b3ff70d-operator-scripts\") pod \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\" (UID: \"4e1bee18-1dd7-42be-b113-6a746b3ff70d\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.862189 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.862553 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1bee18-1dd7-42be-b113-6a746b3ff70d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e1bee18-1dd7-42be-b113-6a746b3ff70d" (UID: "4e1bee18-1dd7-42be-b113-6a746b3ff70d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.867438 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.868329 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1bee18-1dd7-42be-b113-6a746b3ff70d-kube-api-access-q8z62" (OuterVolumeSpecName: "kube-api-access-q8z62") pod "4e1bee18-1dd7-42be-b113-6a746b3ff70d" (UID: "4e1bee18-1dd7-42be-b113-6a746b3ff70d"). InnerVolumeSpecName "kube-api-access-q8z62". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.913966 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xmpzp"] Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914336 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c47888-1780-4539-8777-5914009b862f" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914353 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c47888-1780-4539-8777-5914009b862f" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914369 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-httpd" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914377 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-httpd" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914392 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="ovsdbserver-nb" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914412 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="ovsdbserver-nb" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914424 4841 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365caacf-756c-4558-b281-f8644c9c1c5f" containerName="galera" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914429 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="365caacf-756c-4558-b281-f8644c9c1c5f" containerName="galera" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914451 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-server" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914457 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-server" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914472 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365caacf-756c-4558-b281-f8644c9c1c5f" containerName="mysql-bootstrap" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914478 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="365caacf-756c-4558-b281-f8644c9c1c5f" containerName="mysql-bootstrap" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914487 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="ovsdbserver-sb" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914493 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="ovsdbserver-sb" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914502 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914508 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914518 4841 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerName="init" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914523 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerName="init" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914532 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914537 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914549 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe10a9a-a21e-4b2c-a5da-d50340115c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914555 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe10a9a-a21e-4b2c-a5da-d50340115c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:29:46 crc kubenswrapper[4841]: E0130 05:29:46.914564 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerName="dnsmasq-dns" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914569 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerName="dnsmasq-dns" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914705 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c47888-1780-4539-8777-5914009b862f" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914716 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="openstack-network-exporter" Jan 30 05:29:46 crc 
kubenswrapper[4841]: I0130 05:29:46.914725 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd33f000-ac38-400f-95b4-d9f6a68d13c0" containerName="ovsdbserver-sb" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914734 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="openstack-network-exporter" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914743 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-httpd" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914753 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da7e312-7550-4d60-a14b-d5dbdc500e88" containerName="dnsmasq-dns" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914759 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="365caacf-756c-4558-b281-f8644c9c1c5f" containerName="galera" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914766 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe10a9a-a21e-4b2c-a5da-d50340115c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914777 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df66af1-0c57-44f7-8b54-bc351a3faa66" containerName="ovsdbserver-nb" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.914785 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f198eff9-f493-43d9-9b64-06196b205142" containerName="proxy-server" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.915327 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.916891 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.924948 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xmpzp"] Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.962969 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4335ead-4d73-4061-b069-960881c2f2d9-operator-scripts\") pod \"f4335ead-4d73-4061-b069-960881c2f2d9\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.963091 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d8335-345d-4847-afca-b5667e0b4c0f-operator-scripts\") pod \"703d8335-345d-4847-afca-b5667e0b4c0f\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.963143 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn558\" (UniqueName: \"kubernetes.io/projected/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-kube-api-access-bn558\") pod \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.963205 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dv67\" (UniqueName: \"kubernetes.io/projected/703d8335-345d-4847-afca-b5667e0b4c0f-kube-api-access-9dv67\") pod \"703d8335-345d-4847-afca-b5667e0b4c0f\" (UID: \"703d8335-345d-4847-afca-b5667e0b4c0f\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.963282 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-operator-scripts\") pod \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\" (UID: \"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.963304 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqqfr\" (UniqueName: \"kubernetes.io/projected/69db2b46-7a27-45b3-9bca-fac2189f47ef-kube-api-access-sqqfr\") pod \"69db2b46-7a27-45b3-9bca-fac2189f47ef\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.963339 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c248\" (UniqueName: \"kubernetes.io/projected/f4335ead-4d73-4061-b069-960881c2f2d9-kube-api-access-5c248\") pod \"f4335ead-4d73-4061-b069-960881c2f2d9\" (UID: \"f4335ead-4d73-4061-b069-960881c2f2d9\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.963356 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69db2b46-7a27-45b3-9bca-fac2189f47ef-operator-scripts\") pod \"69db2b46-7a27-45b3-9bca-fac2189f47ef\" (UID: \"69db2b46-7a27-45b3-9bca-fac2189f47ef\") " Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.964646 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61d87ff7-95c3-43dc-afab-c9ba144b69e9-operator-scripts\") pod \"root-account-create-update-xmpzp\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.964769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wps9m\" (UniqueName: 
\"kubernetes.io/projected/61d87ff7-95c3-43dc-afab-c9ba144b69e9-kube-api-access-wps9m\") pod \"root-account-create-update-xmpzp\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.964817 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8z62\" (UniqueName: \"kubernetes.io/projected/4e1bee18-1dd7-42be-b113-6a746b3ff70d-kube-api-access-q8z62\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.964830 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1bee18-1dd7-42be-b113-6a746b3ff70d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.965199 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4335ead-4d73-4061-b069-960881c2f2d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4335ead-4d73-4061-b069-960881c2f2d9" (UID: "f4335ead-4d73-4061-b069-960881c2f2d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.965534 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703d8335-345d-4847-afca-b5667e0b4c0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "703d8335-345d-4847-afca-b5667e0b4c0f" (UID: "703d8335-345d-4847-afca-b5667e0b4c0f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.967292 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69db2b46-7a27-45b3-9bca-fac2189f47ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69db2b46-7a27-45b3-9bca-fac2189f47ef" (UID: "69db2b46-7a27-45b3-9bca-fac2189f47ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.968270 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3106f00a-8ed8-4189-be2e-f5c6cce1b4ca" (UID: "3106f00a-8ed8-4189-be2e-f5c6cce1b4ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.971432 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-kube-api-access-bn558" (OuterVolumeSpecName: "kube-api-access-bn558") pod "3106f00a-8ed8-4189-be2e-f5c6cce1b4ca" (UID: "3106f00a-8ed8-4189-be2e-f5c6cce1b4ca"). InnerVolumeSpecName "kube-api-access-bn558". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.971944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69db2b46-7a27-45b3-9bca-fac2189f47ef-kube-api-access-sqqfr" (OuterVolumeSpecName: "kube-api-access-sqqfr") pod "69db2b46-7a27-45b3-9bca-fac2189f47ef" (UID: "69db2b46-7a27-45b3-9bca-fac2189f47ef"). InnerVolumeSpecName "kube-api-access-sqqfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.972870 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4335ead-4d73-4061-b069-960881c2f2d9-kube-api-access-5c248" (OuterVolumeSpecName: "kube-api-access-5c248") pod "f4335ead-4d73-4061-b069-960881c2f2d9" (UID: "f4335ead-4d73-4061-b069-960881c2f2d9"). InnerVolumeSpecName "kube-api-access-5c248". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:46 crc kubenswrapper[4841]: I0130 05:29:46.975219 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703d8335-345d-4847-afca-b5667e0b4c0f-kube-api-access-9dv67" (OuterVolumeSpecName: "kube-api-access-9dv67") pod "703d8335-345d-4847-afca-b5667e0b4c0f" (UID: "703d8335-345d-4847-afca-b5667e0b4c0f"). InnerVolumeSpecName "kube-api-access-9dv67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067024 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wps9m\" (UniqueName: \"kubernetes.io/projected/61d87ff7-95c3-43dc-afab-c9ba144b69e9-kube-api-access-wps9m\") pod \"root-account-create-update-xmpzp\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61d87ff7-95c3-43dc-afab-c9ba144b69e9-operator-scripts\") pod \"root-account-create-update-xmpzp\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067283 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn558\" (UniqueName: 
\"kubernetes.io/projected/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-kube-api-access-bn558\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067298 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dv67\" (UniqueName: \"kubernetes.io/projected/703d8335-345d-4847-afca-b5667e0b4c0f-kube-api-access-9dv67\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067325 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067334 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqqfr\" (UniqueName: \"kubernetes.io/projected/69db2b46-7a27-45b3-9bca-fac2189f47ef-kube-api-access-sqqfr\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067342 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c248\" (UniqueName: \"kubernetes.io/projected/f4335ead-4d73-4061-b069-960881c2f2d9-kube-api-access-5c248\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067351 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69db2b46-7a27-45b3-9bca-fac2189f47ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067359 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4335ead-4d73-4061-b069-960881c2f2d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067366 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/703d8335-345d-4847-afca-b5667e0b4c0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.067782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61d87ff7-95c3-43dc-afab-c9ba144b69e9-operator-scripts\") pod \"root-account-create-update-xmpzp\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.089976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wps9m\" (UniqueName: \"kubernetes.io/projected/61d87ff7-95c3-43dc-afab-c9ba144b69e9-kube-api-access-wps9m\") pod \"root-account-create-update-xmpzp\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.119357 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.171:8776/healthcheck\": dial tcp 10.217.0.171:8776: connect: connection refused" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.143045 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.143300 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-central-agent" containerID="cri-o://cfc0ae345e05eb3f086c878d95ec4681a8950e4f7bd6e49a010f4a0710f07baa" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.145808 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84508935-db34-4e2b-a3af-800ac432353b" 
containerName="sg-core" containerID="cri-o://e8fd8e866eed90045f99756e46172a271386a959f2b77da427ad4735cdb79473" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.145939 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="proxy-httpd" containerID="cri-o://2888bc3f6d872d8a4049937fb5ba2193d6506521580fd4ed32cf36760940a837" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.145975 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-notification-agent" containerID="cri-o://d71848d69df407548c21d416cd1905e03bc4273aed56307ec048215b6bd60f64" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.187206 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.211620 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ef067657-4804-406e-b45f-e19553dcd2d8" containerName="kube-state-metrics" containerID="cri-o://c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.269267 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.331928 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.332308 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="90edf3da-3cbc-407f-9cfa-de97879f3834" containerName="memcached" containerID="cri-o://33a2da7865353e0526f0692b43cfed86797432813b0f51302a7eafdb6082c4e2" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.348423 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1 is running failed: container process not found" containerID="93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.348772 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2e68-account-create-update-9fkzh"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.369961 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2e68-account-create-update-9fkzh"] Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.371798 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1 is running failed: container process not found" containerID="93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.373681 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1 is running failed: container process not found" containerID="93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.373736 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="9baa24e8-552c-425b-a494-ca70b9bcff0c" containerName="nova-cell0-conductor-conductor" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.375035 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2e68-account-create-update-ldwvb"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.376585 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.385528 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.387088 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wwlqb"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.392668 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1306-account-create-update-9p79b" event={"ID":"3106f00a-8ed8-4189-be2e-f5c6cce1b4ca","Type":"ContainerDied","Data":"eb15ee17f4d7c8dc588c845a05a968e79aaededfe3658159c5af648558a58bc2"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.392777 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1306-account-create-update-9p79b" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.397943 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-45dj5"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.403791 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wwlqb"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.409171 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-45dj5"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.415049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2e68-account-create-update-ldwvb"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.419660 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5445c58497-n245m"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.419870 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5445c58497-n245m" podUID="b191848f-8f16-40e6-8a2b-f66a0179f359" containerName="keystone-api" containerID="cri-o://5f9a9849cbf3d28b7e6adfb866f7318a167f6a59eaf9156c8cc67f2d836ff57f" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.425060 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.425082 4841 generic.go:334] "Generic (PLEG): container finished" podID="9baa24e8-552c-425b-a494-ca70b9bcff0c" containerID="93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1" exitCode=0 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.425103 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9baa24e8-552c-425b-a494-ca70b9bcff0c","Type":"ContainerDied","Data":"93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1"} Jan 30 05:29:47 crc 
kubenswrapper[4841]: I0130 05:29:47.441507 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2knp8"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.452929 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2knp8"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.456286 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"365caacf-756c-4558-b281-f8644c9c1c5f","Type":"ContainerDied","Data":"fc053fb5b0d026c63c908afd011d2cb35880c88e1d810dafc464dc552806e00f"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.456319 4841 scope.go:117] "RemoveContainer" containerID="42b9436e6ff50cbea7adadc24b6d483db87ac5a4389559de150639c961eb2f31" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.456461 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.467560 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2e68-account-create-update-ldwvb"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.469162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9a61-account-create-update-8jgdj" event={"ID":"4e1bee18-1dd7-42be-b113-6a746b3ff70d","Type":"ContainerDied","Data":"f5e7d86049bcc5e08b656357960f8a4470152ba9d86730ed4073bdb2e47f5a8a"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.469250 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9a61-account-create-update-8jgdj" Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.469354 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qmqn6 operator-scripts], unattached volumes=[], failed to process volumes=[kube-api-access-qmqn6 operator-scripts]: context canceled" pod="openstack/keystone-2e68-account-create-update-ldwvb" podUID="06e72b40-f22a-4602-a483-7653890ae089" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.476512 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.476682 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqn6\" (UniqueName: \"kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.487254 4841 generic.go:334] "Generic (PLEG): container finished" podID="84508935-db34-4e2b-a3af-800ac432353b" containerID="e8fd8e866eed90045f99756e46172a271386a959f2b77da427ad4735cdb79473" exitCode=2 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.487300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerDied","Data":"e8fd8e866eed90045f99756e46172a271386a959f2b77da427ad4735cdb79473"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.494475 4841 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xmpzp"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.498094 4841 generic.go:334] "Generic (PLEG): container finished" podID="1a2724da-6b9b-4947-a4e3-894938742304" containerID="a480751fcbd360c7947b1114ae086d16d2b9b26b051a46078f3d5cb250e0a982" exitCode=0 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.498687 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a2724da-6b9b-4947-a4e3-894938742304","Type":"ContainerDied","Data":"a480751fcbd360c7947b1114ae086d16d2b9b26b051a46078f3d5cb250e0a982"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.502792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d99c-account-create-update-xhj29" event={"ID":"69db2b46-7a27-45b3-9bca-fac2189f47ef","Type":"ContainerDied","Data":"4b1739550fc839f0fd61e46baaac5a1cf00c4b6971e4cb1bfd77b08bb0ce7bc2"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.502885 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d99c-account-create-update-xhj29" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.506900 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1306-account-create-update-9p79b"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.509079 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1306-account-create-update-9p79b"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.511122 4841 generic.go:334] "Generic (PLEG): container finished" podID="28551500-d017-475a-aae4-8352782c0b4e" containerID="1608c08a181e26696b13ff2abda250d1ad88e50ec15ce51f053cef31c22f983e" exitCode=0 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.511165 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc8b6ddd6-nkc6r" event={"ID":"28551500-d017-475a-aae4-8352782c0b4e","Type":"ContainerDied","Data":"1608c08a181e26696b13ff2abda250d1ad88e50ec15ce51f053cef31c22f983e"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.512072 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fec-account-create-update-46xxt" event={"ID":"703d8335-345d-4847-afca-b5667e0b4c0f","Type":"ContainerDied","Data":"cbc0bf8fa64e7fb6ec3c68ab7842c5c6aa8b5911b9beca6f3b472cc0c18c6f1b"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.512133 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1fec-account-create-update-46xxt" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.520037 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-de72-account-create-update-ckzp8" event={"ID":"f4335ead-4d73-4061-b069-960881c2f2d9","Type":"ContainerDied","Data":"608bf025bd1045513c66e1133ef707f88b1ba91cc955e54d5935933c4043c5c9"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.520144 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-de72-account-create-update-ckzp8" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.528832 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.529071 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe10a9a-a21e-4b2c-a5da-d50340115c7a","Type":"ContainerDied","Data":"7c5495bfedaeafce9f0e4e0f6e39d3e026f871946ae30e63e06981febebac3a9"} Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.578568 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.578659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqn6\" (UniqueName: \"kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.578843 4841 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.578989 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts podName:06e72b40-f22a-4602-a483-7653890ae089 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:48.078945686 +0000 UTC m=+1325.072418324 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts") pod "keystone-2e68-account-create-update-ldwvb" (UID: "06e72b40-f22a-4602-a483-7653890ae089") : configmap "openstack-scripts" not found Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.583644 4841 projected.go:194] Error preparing data for projected volume kube-api-access-qmqn6 for pod openstack/keystone-2e68-account-create-update-ldwvb: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:47 crc kubenswrapper[4841]: E0130 05:29:47.583703 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6 podName:06e72b40-f22a-4602-a483-7653890ae089 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:48.083686613 +0000 UTC m=+1325.077159251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qmqn6" (UniqueName: "kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6") pod "keystone-2e68-account-create-update-ldwvb" (UID: "06e72b40-f22a-4602-a483-7653890ae089") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.624615 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.662438 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9a61-account-create-update-8jgdj"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.678686 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerName="galera" containerID="cri-o://11b34a44cee08eb57f7dd12e3101d19de35b79f567abf07b3e86b4e5ced1412b" gracePeriod=30 Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.679473 4841 scope.go:117] "RemoveContainer" containerID="e0d76e4beb885ef332ab1fc012958772ae48e732b5830a56d72303260efbbb68" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.679538 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9a61-account-create-update-8jgdj"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.689466 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-de72-account-create-update-ckzp8"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.716669 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-de72-account-create-update-ckzp8"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.747878 4841 scope.go:117] "RemoveContainer" containerID="de54af2206ef9739b9851bdbdbfe8715a9ca62af5991471c3c6388dc3e2c68b3" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.764883 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d99c-account-create-update-xhj29"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.779932 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d99c-account-create-update-xhj29"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.782128 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-config-data\") pod \"9baa24e8-552c-425b-a494-ca70b9bcff0c\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.782210 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-combined-ca-bundle\") pod \"9baa24e8-552c-425b-a494-ca70b9bcff0c\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.782339 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2t4x\" (UniqueName: \"kubernetes.io/projected/9baa24e8-552c-425b-a494-ca70b9bcff0c-kube-api-access-x2t4x\") pod \"9baa24e8-552c-425b-a494-ca70b9bcff0c\" (UID: \"9baa24e8-552c-425b-a494-ca70b9bcff0c\") " Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.793073 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9baa24e8-552c-425b-a494-ca70b9bcff0c-kube-api-access-x2t4x" (OuterVolumeSpecName: "kube-api-access-x2t4x") pod "9baa24e8-552c-425b-a494-ca70b9bcff0c" (UID: "9baa24e8-552c-425b-a494-ca70b9bcff0c"). InnerVolumeSpecName "kube-api-access-x2t4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.805640 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-46xxt"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.816002 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1fec-account-create-update-46xxt"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.826727 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9baa24e8-552c-425b-a494-ca70b9bcff0c" (UID: "9baa24e8-552c-425b-a494-ca70b9bcff0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.829445 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.834840 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-config-data" (OuterVolumeSpecName: "config-data") pod "9baa24e8-552c-425b-a494-ca70b9bcff0c" (UID: "9baa24e8-552c-425b-a494-ca70b9bcff0c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.839261 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mtk2w" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.842742 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.849917 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.854472 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.887141 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.887187 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2t4x\" (UniqueName: \"kubernetes.io/projected/9baa24e8-552c-425b-a494-ca70b9bcff0c-kube-api-access-x2t4x\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.887201 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baa24e8-552c-425b-a494-ca70b9bcff0c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.920366 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.932439 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mtk2w" Jan 30 05:29:47 crc kubenswrapper[4841]: I0130 05:29:47.990259 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.069763 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtk2w"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090489 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-internal-tls-certs\") pod \"28551500-d017-475a-aae4-8352782c0b4e\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090550 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090592 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data-custom\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090645 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9p22\" (UniqueName: \"kubernetes.io/projected/28551500-d017-475a-aae4-8352782c0b4e-kube-api-access-j9p22\") pod \"28551500-d017-475a-aae4-8352782c0b4e\" (UID: 
\"28551500-d017-475a-aae4-8352782c0b4e\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090703 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-combined-ca-bundle\") pod \"28551500-d017-475a-aae4-8352782c0b4e\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090742 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-config-data\") pod \"28551500-d017-475a-aae4-8352782c0b4e\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090759 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2724da-6b9b-4947-a4e3-894938742304-logs\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090776 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-scripts\") pod \"28551500-d017-475a-aae4-8352782c0b4e\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090797 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a2724da-6b9b-4947-a4e3-894938742304-etc-machine-id\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090840 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-public-tls-certs\") pod \"28551500-d017-475a-aae4-8352782c0b4e\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-public-tls-certs\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090883 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-combined-ca-bundle\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090913 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9g78\" (UniqueName: \"kubernetes.io/projected/1a2724da-6b9b-4947-a4e3-894938742304-kube-api-access-z9g78\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090952 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-internal-tls-certs\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.090970 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-scripts\") pod \"1a2724da-6b9b-4947-a4e3-894938742304\" (UID: \"1a2724da-6b9b-4947-a4e3-894938742304\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 
05:29:48.090985 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28551500-d017-475a-aae4-8352782c0b4e-logs\") pod \"28551500-d017-475a-aae4-8352782c0b4e\" (UID: \"28551500-d017-475a-aae4-8352782c0b4e\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.091205 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqn6\" (UniqueName: \"kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.091322 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.091425 4841 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.091482 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts podName:06e72b40-f22a-4602-a483-7653890ae089 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:49.091455755 +0000 UTC m=+1326.084928393 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts") pod "keystone-2e68-account-create-update-ldwvb" (UID: "06e72b40-f22a-4602-a483-7653890ae089") : configmap "openstack-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.091524 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a2724da-6b9b-4947-a4e3-894938742304-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.094481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-scripts" (OuterVolumeSpecName: "scripts") pod "28551500-d017-475a-aae4-8352782c0b4e" (UID: "28551500-d017-475a-aae4-8352782c0b4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.099074 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.099996 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28551500-d017-475a-aae4-8352782c0b4e-logs" (OuterVolumeSpecName: "logs") pod "28551500-d017-475a-aae4-8352782c0b4e" (UID: "28551500-d017-475a-aae4-8352782c0b4e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.102794 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28551500-d017-475a-aae4-8352782c0b4e-kube-api-access-j9p22" (OuterVolumeSpecName: "kube-api-access-j9p22") pod "28551500-d017-475a-aae4-8352782c0b4e" (UID: "28551500-d017-475a-aae4-8352782c0b4e"). InnerVolumeSpecName "kube-api-access-j9p22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.104131 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a2724da-6b9b-4947-a4e3-894938742304-logs" (OuterVolumeSpecName: "logs") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.114609 4841 projected.go:194] Error preparing data for projected volume kube-api-access-qmqn6 for pod openstack/keystone-2e68-account-create-update-ldwvb: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.114678 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6 podName:06e72b40-f22a-4602-a483-7653890ae089 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:49.114657939 +0000 UTC m=+1326.108130577 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qmqn6" (UniqueName: "kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6") pod "keystone-2e68-account-create-update-ldwvb" (UID: "06e72b40-f22a-4602-a483-7653890ae089") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.124596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-scripts" (OuterVolumeSpecName: "scripts") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.154134 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a2724da-6b9b-4947-a4e3-894938742304-kube-api-access-z9g78" (OuterVolumeSpecName: "kube-api-access-z9g78") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "kube-api-access-z9g78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.157639 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.183931 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:48216->10.217.0.209:8775: read: connection reset by peer" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.184077 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:48206->10.217.0.209:8775: read: connection reset by peer" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.191960 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.176:9292/healthcheck\": dial tcp 10.217.0.176:9292: connect: connection refused" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.192923 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.193349 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9g78\" (UniqueName: \"kubernetes.io/projected/1a2724da-6b9b-4947-a4e3-894938742304-kube-api-access-z9g78\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.193464 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 
05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.193526 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28551500-d017-475a-aae4-8352782c0b4e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.193580 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.193633 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9p22\" (UniqueName: \"kubernetes.io/projected/28551500-d017-475a-aae4-8352782c0b4e-kube-api-access-j9p22\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.193692 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2724da-6b9b-4947-a4e3-894938742304-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.193913 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.194000 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a2724da-6b9b-4947-a4e3-894938742304-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.202884 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.176:9292/healthcheck\": dial tcp 10.217.0.176:9292: connect: connection refused" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 
05:29:48.260886 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.268185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28551500-d017-475a-aae4-8352782c0b4e" (UID: "28551500-d017-475a-aae4-8352782c0b4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.280916 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-config-data" (OuterVolumeSpecName: "config-data") pod "28551500-d017-475a-aae4-8352782c0b4e" (UID: "28551500-d017-475a-aae4-8352782c0b4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.288659 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.299647 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.299674 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.299685 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.299693 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.307566 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "28551500-d017-475a-aae4-8352782c0b4e" (UID: "28551500-d017-475a-aae4-8352782c0b4e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.330137 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data" (OuterVolumeSpecName: "config-data") pod "1a2724da-6b9b-4947-a4e3-894938742304" (UID: "1a2724da-6b9b-4947-a4e3-894938742304"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.339340 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-676d-account-create-update-tkbqn" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.375319 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "28551500-d017-475a-aae4-8352782c0b4e" (UID: "28551500-d017-475a-aae4-8352782c0b4e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.402092 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.402121 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28551500-d017-475a-aae4-8352782c0b4e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.402130 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a2724da-6b9b-4947-a4e3-894938742304-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.402989 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d94d6f7cb-lf9nq" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:42574->10.217.0.165:9311: read: connection reset by peer" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.403270 4841 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/barbican-api-d94d6f7cb-lf9nq" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:42588->10.217.0.165:9311: read: connection reset by peer" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.443056 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079879f7-939a-47a7-bbeb-98e1f8d7159b" path="/var/lib/kubelet/pods/079879f7-939a-47a7-bbeb-98e1f8d7159b/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.443726 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3106f00a-8ed8-4189-be2e-f5c6cce1b4ca" path="/var/lib/kubelet/pods/3106f00a-8ed8-4189-be2e-f5c6cce1b4ca/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.444238 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365caacf-756c-4558-b281-f8644c9c1c5f" path="/var/lib/kubelet/pods/365caacf-756c-4558-b281-f8644c9c1c5f/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.445589 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1bee18-1dd7-42be-b113-6a746b3ff70d" path="/var/lib/kubelet/pods/4e1bee18-1dd7-42be-b113-6a746b3ff70d/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.446046 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed3d0b9-0cdf-4174-8283-5dd9faadbe34" path="/var/lib/kubelet/pods/4ed3d0b9-0cdf-4174-8283-5dd9faadbe34/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.446688 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3379a5-71bf-421e-8262-9f63141e2a09" path="/var/lib/kubelet/pods/4f3379a5-71bf-421e-8262-9f63141e2a09/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.447413 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c5a675-0799-4ed6-b76b-3bfc55c6acbc" 
path="/var/lib/kubelet/pods/69c5a675-0799-4ed6-b76b-3bfc55c6acbc/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.449036 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69db2b46-7a27-45b3-9bca-fac2189f47ef" path="/var/lib/kubelet/pods/69db2b46-7a27-45b3-9bca-fac2189f47ef/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.449439 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe10a9a-a21e-4b2c-a5da-d50340115c7a" path="/var/lib/kubelet/pods/6fe10a9a-a21e-4b2c-a5da-d50340115c7a/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.449984 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703d8335-345d-4847-afca-b5667e0b4c0f" path="/var/lib/kubelet/pods/703d8335-345d-4847-afca-b5667e0b4c0f/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.450506 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f198eff9-f493-43d9-9b64-06196b205142" path="/var/lib/kubelet/pods/f198eff9-f493-43d9-9b64-06196b205142/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.451885 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4335ead-4d73-4061-b069-960881c2f2d9" path="/var/lib/kubelet/pods/f4335ead-4d73-4061-b069-960881c2f2d9/volumes" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.509196 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff1ce10f-fa3e-4526-a823-f4defdaf9085-operator-scripts\") pod \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\" (UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.509506 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vgv5\" (UniqueName: \"kubernetes.io/projected/ff1ce10f-fa3e-4526-a823-f4defdaf9085-kube-api-access-9vgv5\") pod \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\" 
(UID: \"ff1ce10f-fa3e-4526-a823-f4defdaf9085\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.509882 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1ce10f-fa3e-4526-a823-f4defdaf9085-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff1ce10f-fa3e-4526-a823-f4defdaf9085" (UID: "ff1ce10f-fa3e-4526-a823-f4defdaf9085"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.518583 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1ce10f-fa3e-4526-a823-f4defdaf9085-kube-api-access-9vgv5" (OuterVolumeSpecName: "kube-api-access-9vgv5") pod "ff1ce10f-fa3e-4526-a823-f4defdaf9085" (UID: "ff1ce10f-fa3e-4526-a823-f4defdaf9085"). InnerVolumeSpecName "kube-api-access-9vgv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.556819 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 05:29:48 crc kubenswrapper[4841]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 30 05:29:48 crc kubenswrapper[4841]: Jan 30 05:29:48 crc kubenswrapper[4841]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 30 05:29:48 crc kubenswrapper[4841]: Jan 30 05:29:48 crc kubenswrapper[4841]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 30 05:29:48 crc kubenswrapper[4841]: Jan 30 05:29:48 crc kubenswrapper[4841]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 30 05:29:48 crc kubenswrapper[4841]: Jan 30 05:29:48 crc kubenswrapper[4841]: if [ -n "" ]; then Jan 30 05:29:48 crc kubenswrapper[4841]: GRANT_DATABASE="" Jan 30 05:29:48 crc kubenswrapper[4841]: else Jan 
30 05:29:48 crc kubenswrapper[4841]: GRANT_DATABASE="*" Jan 30 05:29:48 crc kubenswrapper[4841]: fi Jan 30 05:29:48 crc kubenswrapper[4841]: Jan 30 05:29:48 crc kubenswrapper[4841]: # going for maximum compatibility here: Jan 30 05:29:48 crc kubenswrapper[4841]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 30 05:29:48 crc kubenswrapper[4841]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 30 05:29:48 crc kubenswrapper[4841]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 30 05:29:48 crc kubenswrapper[4841]: # support updates Jan 30 05:29:48 crc kubenswrapper[4841]: Jan 30 05:29:48 crc kubenswrapper[4841]: $MYSQL_CMD < logger="UnhandledError" Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.558109 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-xmpzp" podUID="61d87ff7-95c3-43dc-afab-c9ba144b69e9" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.562696 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef067657-4804-406e-b45f-e19553dcd2d8" containerID="c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad" exitCode=2 Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.569744 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.204:3000/\": dial tcp 10.217.0.204:3000: connect: connection refused" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.570630 4841 generic.go:334] "Generic (PLEG): container finished" podID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerID="d334449b0de1d005d626d94fd883ec0192a4936b40dc9ec24b425dde3a584637" exitCode=0 Jan 30 05:29:48 crc 
kubenswrapper[4841]: I0130 05:29:48.573647 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc8b6ddd6-nkc6r" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.594520 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.611445 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff1ce10f-fa3e-4526-a823-f4defdaf9085-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.611476 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vgv5\" (UniqueName: \"kubernetes.io/projected/ff1ce10f-fa3e-4526-a823-f4defdaf9085-kube-api-access-9vgv5\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.612861 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9e49c58-8075-46a1-8bfd-44412a673589" containerID="423863723ace49a11675767356ead1a32dc7658d7769ea7c64a96cba5343a3ca" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.615550 4841 generic.go:334] "Generic (PLEG): container finished" podID="91db9edf-7d6d-4189-aaac-480a438900be" containerID="bc897d8ef0f32c66c2606560ae71bcd74a208effcd7e7b2a87ba2f0b34843405" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.627945 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.636664 4841 generic.go:334] "Generic (PLEG): container finished" podID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerID="249a278941e10cc86f1a6bbc7212e043a936d0a3e53d7ee03f0ed2a158d8459b" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.642699 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-676d-account-create-update-tkbqn" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.670477 4841 generic.go:334] "Generic (PLEG): container finished" podID="90edf3da-3cbc-407f-9cfa-de97879f3834" containerID="33a2da7865353e0526f0692b43cfed86797432813b0f51302a7eafdb6082c4e2" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.679907 4841 generic.go:334] "Generic (PLEG): container finished" podID="84508935-db34-4e2b-a3af-800ac432353b" containerID="2888bc3f6d872d8a4049937fb5ba2193d6506521580fd4ed32cf36760940a837" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.679926 4841 generic.go:334] "Generic (PLEG): container finished" podID="84508935-db34-4e2b-a3af-800ac432353b" containerID="cfc0ae345e05eb3f086c878d95ec4681a8950e4f7bd6e49a010f4a0710f07baa" exitCode=0 Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.679977 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.696445 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.712954 4841 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.713031 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts podName:1afed894-4dfb-4873-a45c-29b70507295a nodeName:}" failed. No retries permitted until 2026-01-30 05:29:52.713017922 +0000 UTC m=+1329.706490560 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts") pod "root-account-create-update-7z8bf" (UID: "1afed894-4dfb-4873-a45c-29b70507295a") : configmap "openstack-cell1-scripts" not found Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744808 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef067657-4804-406e-b45f-e19553dcd2d8","Type":"ContainerDied","Data":"c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744846 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xmpzp"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744867 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef067657-4804-406e-b45f-e19553dcd2d8","Type":"ContainerDied","Data":"b9f7b2bd9deaf526075ef3854fe26c25fc2dbb657b638d758de565f569a85f74"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744882 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7be8df86-7b8d-4741-ae13-ec1b243549b3","Type":"ContainerDied","Data":"d334449b0de1d005d626d94fd883ec0192a4936b40dc9ec24b425dde3a584637"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744894 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc8b6ddd6-nkc6r" event={"ID":"28551500-d017-475a-aae4-8352782c0b4e","Type":"ContainerDied","Data":"fb3571d156a923c7d7af537c762725872e8f52a1c84ffa314dad1c545f735afb"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744906 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9baa24e8-552c-425b-a494-ca70b9bcff0c","Type":"ContainerDied","Data":"370b2e574720f045233c9ef080ea7db818028730dc30c9be26a139d9ef9a22b0"} Jan 30 05:29:48 crc 
kubenswrapper[4841]: I0130 05:29:48.744917 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9e49c58-8075-46a1-8bfd-44412a673589","Type":"ContainerDied","Data":"423863723ace49a11675767356ead1a32dc7658d7769ea7c64a96cba5343a3ca"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744928 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9e49c58-8075-46a1-8bfd-44412a673589","Type":"ContainerDied","Data":"87ea243204ba0ef9295c71af73b5b0686f8305cf223636aee885ab9dcd4663a4"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744937 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87ea243204ba0ef9295c71af73b5b0686f8305cf223636aee885ab9dcd4663a4" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744947 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d94d6f7cb-lf9nq" event={"ID":"91db9edf-7d6d-4189-aaac-480a438900be","Type":"ContainerDied","Data":"bc897d8ef0f32c66c2606560ae71bcd74a208effcd7e7b2a87ba2f0b34843405"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744958 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a2724da-6b9b-4947-a4e3-894938742304","Type":"ContainerDied","Data":"36f7afdbe5084d5625b8087adb504fde099a7eba590244560f3e9ec87dbbbff8"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744969 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"73fdf532-7bb7-43db-acbc-b949166ccd6b","Type":"ContainerDied","Data":"249a278941e10cc86f1a6bbc7212e043a936d0a3e53d7ee03f0ed2a158d8459b"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744979 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-676d-account-create-update-tkbqn" 
event={"ID":"ff1ce10f-fa3e-4526-a823-f4defdaf9085","Type":"ContainerDied","Data":"43ed044e1286fb9780ca41391af739022fad6b12ebd947d30dabd327cf555fca"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744989 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7z8bf" event={"ID":"1afed894-4dfb-4873-a45c-29b70507295a","Type":"ContainerDied","Data":"68a54c2161fbb02f8f5192095e93d59f373e572d8122da43c30c351668ab291a"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.744998 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a54c2161fbb02f8f5192095e93d59f373e572d8122da43c30c351668ab291a" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.745005 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"90edf3da-3cbc-407f-9cfa-de97879f3834","Type":"ContainerDied","Data":"33a2da7865353e0526f0692b43cfed86797432813b0f51302a7eafdb6082c4e2"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.745015 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xmpzp" event={"ID":"61d87ff7-95c3-43dc-afab-c9ba144b69e9","Type":"ContainerStarted","Data":"b619be961f81759ded24e296b3d6adce2828cbc3371487faf7077205b3294127"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.745024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerDied","Data":"2888bc3f6d872d8a4049937fb5ba2193d6506521580fd4ed32cf36760940a837"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.745034 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerDied","Data":"cfc0ae345e05eb3f086c878d95ec4681a8950e4f7bd6e49a010f4a0710f07baa"} Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.745049 4841 scope.go:117] "RemoveContainer" 
containerID="c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.777234 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7z8bf" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.805675 4841 scope.go:117] "RemoveContainer" containerID="c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad" Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.807626 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad\": container with ID starting with c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad not found: ID does not exist" containerID="c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.808013 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad"} err="failed to get container status \"c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad\": rpc error: code = NotFound desc = could not find container \"c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad\": container with ID starting with c8c1f488e68caec7d1b79e8f75ff07f5969a9c9df7b885521ec898c37e68bdad not found: ID does not exist" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.808054 4841 scope.go:117] "RemoveContainer" containerID="1608c08a181e26696b13ff2abda250d1ad88e50ec15ce51f053cef31c22f983e" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.808530 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.813588 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-certs\") pod \"ef067657-4804-406e-b45f-e19553dcd2d8\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.813641 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtnz\" (UniqueName: \"kubernetes.io/projected/ef067657-4804-406e-b45f-e19553dcd2d8-kube-api-access-pmtnz\") pod \"ef067657-4804-406e-b45f-e19553dcd2d8\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.813699 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-combined-ca-bundle\") pod \"ef067657-4804-406e-b45f-e19553dcd2d8\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.814726 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-config\") pod \"ef067657-4804-406e-b45f-e19553dcd2d8\" (UID: \"ef067657-4804-406e-b45f-e19553dcd2d8\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.832694 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef067657-4804-406e-b45f-e19553dcd2d8-kube-api-access-pmtnz" (OuterVolumeSpecName: "kube-api-access-pmtnz") pod "ef067657-4804-406e-b45f-e19553dcd2d8" (UID: "ef067657-4804-406e-b45f-e19553dcd2d8"). InnerVolumeSpecName "kube-api-access-pmtnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.852555 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.863226 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.870524 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef067657-4804-406e-b45f-e19553dcd2d8" (UID: "ef067657-4804-406e-b45f-e19553dcd2d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.884505 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.903112 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fc8b6ddd6-nkc6r"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.905223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "ef067657-4804-406e-b45f-e19553dcd2d8" (UID: "ef067657-4804-406e-b45f-e19553dcd2d8"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.906340 4841 scope.go:117] "RemoveContainer" containerID="088a47d541f00148328e19a9fb5636697d24c805841cd93d6a4ac4b7d6e6779f" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.909010 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "ef067657-4804-406e-b45f-e19553dcd2d8" (UID: "ef067657-4804-406e-b45f-e19553dcd2d8"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.920225 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6fc8b6ddd6-nkc6r"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.924726 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-logs\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.924827 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nplhv\" (UniqueName: \"kubernetes.io/projected/1afed894-4dfb-4873-a45c-29b70507295a-kube-api-access-nplhv\") pod \"1afed894-4dfb-4873-a45c-29b70507295a\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.924894 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-config-data\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.924919 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-scripts\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.924958 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-httpd-run\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.924976 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-public-tls-certs\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925010 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-combined-ca-bundle\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925039 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5pwb\" (UniqueName: \"kubernetes.io/projected/a9e49c58-8075-46a1-8bfd-44412a673589-kube-api-access-m5pwb\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925062 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts\") pod 
\"1afed894-4dfb-4873-a45c-29b70507295a\" (UID: \"1afed894-4dfb-4873-a45c-29b70507295a\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925124 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a9e49c58-8075-46a1-8bfd-44412a673589\" (UID: \"a9e49c58-8075-46a1-8bfd-44412a673589\") " Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925536 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925550 4841 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925560 4841 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef067657-4804-406e-b45f-e19553dcd2d8-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.925569 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtnz\" (UniqueName: \"kubernetes.io/projected/ef067657-4804-406e-b45f-e19553dcd2d8-kube-api-access-pmtnz\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.931991 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.933507 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.933781 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-logs" (OuterVolumeSpecName: "logs") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.938848 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1afed894-4dfb-4873-a45c-29b70507295a" (UID: "1afed894-4dfb-4873-a45c-29b70507295a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.943462 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1afed894-4dfb-4873-a45c-29b70507295a-kube-api-access-nplhv" (OuterVolumeSpecName: "kube-api-access-nplhv") pod "1afed894-4dfb-4873-a45c-29b70507295a" (UID: "1afed894-4dfb-4873-a45c-29b70507295a"). InnerVolumeSpecName "kube-api-access-nplhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.943696 4841 scope.go:117] "RemoveContainer" containerID="93721abe2b80f73cd1be85412e3cfc53afaba2d7b1d8e8eec69d4a39915539f1" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.945507 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e49c58-8075-46a1-8bfd-44412a673589-kube-api-access-m5pwb" (OuterVolumeSpecName: "kube-api-access-m5pwb") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "kube-api-access-m5pwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.949076 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-scripts" (OuterVolumeSpecName: "scripts") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:48 crc kubenswrapper[4841]: E0130 05:29:48.956331 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9baa24e8_552c_425b_a494_ca70b9bcff0c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff1ce10f_fa3e_4526_a823_f4defdaf9085.slice/crio-43ed044e1286fb9780ca41391af739022fad6b12ebd947d30dabd327cf555fca\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28551500_d017_475a_aae4_8352782c0b4e.slice/crio-fb3571d156a923c7d7af537c762725872e8f52a1c84ffa314dad1c545f735afb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a2724da_6b9b_4947_a4e3_894938742304.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b29e384_bd86_4102_8a9e_4745cd0ae8d5.slice/crio-5c4314bd5d4e8c9a32d1faa3556ab7cd3dae6013e3228dbb71483e6295221f1f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28551500_d017_475a_aae4_8352782c0b4e.slice\": RecentStats: unable to find data in memory cache]" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.961837 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.980009 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-676d-account-create-update-tkbqn"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.980061 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-676d-account-create-update-tkbqn"] Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.986977 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 05:29:48 crc kubenswrapper[4841]: I0130 05:29:48.995645 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.038533 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-config-data" (OuterVolumeSpecName: "config-data") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.038629 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a9e49c58-8075-46a1-8bfd-44412a673589" (UID: "a9e49c58-8075-46a1-8bfd-44412a673589"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.040882 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd9v9\" (UniqueName: \"kubernetes.io/projected/7be8df86-7b8d-4741-ae13-ec1b243549b3-kube-api-access-xd9v9\") pod \"7be8df86-7b8d-4741-ae13-ec1b243549b3\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.041067 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-config-data\") pod \"7be8df86-7b8d-4741-ae13-ec1b243549b3\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.041108 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be8df86-7b8d-4741-ae13-ec1b243549b3-logs\") pod \"7be8df86-7b8d-4741-ae13-ec1b243549b3\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.041156 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-combined-ca-bundle\") pod \"7be8df86-7b8d-4741-ae13-ec1b243549b3\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.041271 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-nova-metadata-tls-certs\") pod \"7be8df86-7b8d-4741-ae13-ec1b243549b3\" (UID: \"7be8df86-7b8d-4741-ae13-ec1b243549b3\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.042513 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7be8df86-7b8d-4741-ae13-ec1b243549b3-logs" (OuterVolumeSpecName: "logs") pod "7be8df86-7b8d-4741-ae13-ec1b243549b3" (UID: "7be8df86-7b8d-4741-ae13-ec1b243549b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.044331 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.049905 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be8df86-7b8d-4741-ae13-ec1b243549b3-kube-api-access-xd9v9" (OuterVolumeSpecName: "kube-api-access-xd9v9") pod "7be8df86-7b8d-4741-ae13-ec1b243549b3" (UID: "7be8df86-7b8d-4741-ae13-ec1b243549b3"). InnerVolumeSpecName "kube-api-access-xd9v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091617 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091649 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091662 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091681 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5pwb\" (UniqueName: 
\"kubernetes.io/projected/a9e49c58-8075-46a1-8bfd-44412a673589-kube-api-access-m5pwb\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091693 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1afed894-4dfb-4873-a45c-29b70507295a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091720 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091923 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e49c58-8075-46a1-8bfd-44412a673589-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091959 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be8df86-7b8d-4741-ae13-ec1b243549b3-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091972 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nplhv\" (UniqueName: \"kubernetes.io/projected/1afed894-4dfb-4873-a45c-29b70507295a-kube-api-access-nplhv\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.091984 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e49c58-8075-46a1-8bfd-44412a673589-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.106610 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-config-data" (OuterVolumeSpecName: "config-data") pod "7be8df86-7b8d-4741-ae13-ec1b243549b3" (UID: 
"7be8df86-7b8d-4741-ae13-ec1b243549b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.108310 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7be8df86-7b8d-4741-ae13-ec1b243549b3" (UID: "7be8df86-7b8d-4741-ae13-ec1b243549b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.132616 4841 scope.go:117] "RemoveContainer" containerID="a480751fcbd360c7947b1114ae086d16d2b9b26b051a46078f3d5cb250e0a982" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.137546 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7be8df86-7b8d-4741-ae13-ec1b243549b3" (UID: "7be8df86-7b8d-4741-ae13-ec1b243549b3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.156068 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.157838 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.176171 4841 scope.go:117] "RemoveContainer" containerID="111f861be82792295977d9e1509a0d34e506936e498cf079b08adb56c250f14b" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.178802 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.179249 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.187442 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.210901 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-memcached-tls-certs\") pod \"90edf3da-3cbc-407f-9cfa-de97879f3834\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.210998 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-config-data\") pod \"90edf3da-3cbc-407f-9cfa-de97879f3834\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.211077 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-kolla-config\") pod \"90edf3da-3cbc-407f-9cfa-de97879f3834\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.211114 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-combined-ca-bundle\") pod \"90edf3da-3cbc-407f-9cfa-de97879f3834\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.211197 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-d5vwx\" (UniqueName: \"kubernetes.io/projected/90edf3da-3cbc-407f-9cfa-de97879f3834-kube-api-access-d5vwx\") pod \"90edf3da-3cbc-407f-9cfa-de97879f3834\" (UID: \"90edf3da-3cbc-407f-9cfa-de97879f3834\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.211444 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqn6\" (UniqueName: \"kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.211557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts\") pod \"keystone-2e68-account-create-update-ldwvb\" (UID: \"06e72b40-f22a-4602-a483-7653890ae089\") " pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.212191 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-config-data" (OuterVolumeSpecName: "config-data") pod "90edf3da-3cbc-407f-9cfa-de97879f3834" (UID: "90edf3da-3cbc-407f-9cfa-de97879f3834"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.212213 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.212303 4841 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.212350 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts podName:06e72b40-f22a-4602-a483-7653890ae089 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:51.212332165 +0000 UTC m=+1328.205804803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts") pod "keystone-2e68-account-create-update-ldwvb" (UID: "06e72b40-f22a-4602-a483-7653890ae089") : configmap "openstack-scripts" not found Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.212596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "90edf3da-3cbc-407f-9cfa-de97879f3834" (UID: "90edf3da-3cbc-407f-9cfa-de97879f3834"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.213386 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.213445 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd9v9\" (UniqueName: \"kubernetes.io/projected/7be8df86-7b8d-4741-ae13-ec1b243549b3-kube-api-access-xd9v9\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.213461 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.213481 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be8df86-7b8d-4741-ae13-ec1b243549b3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.223522 4841 projected.go:194] Error preparing data for projected volume kube-api-access-qmqn6 for pod openstack/keystone-2e68-account-create-update-ldwvb: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.223581 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6 podName:06e72b40-f22a-4602-a483-7653890ae089 nodeName:}" failed. No retries permitted until 2026-01-30 05:29:51.223564328 +0000 UTC m=+1328.217036966 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qmqn6" (UniqueName: "kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6") pod "keystone-2e68-account-create-update-ldwvb" (UID: "06e72b40-f22a-4602-a483-7653890ae089") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.226282 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90edf3da-3cbc-407f-9cfa-de97879f3834-kube-api-access-d5vwx" (OuterVolumeSpecName: "kube-api-access-d5vwx") pod "90edf3da-3cbc-407f-9cfa-de97879f3834" (UID: "90edf3da-3cbc-407f-9cfa-de97879f3834"). InnerVolumeSpecName "kube-api-access-d5vwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.238883 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90edf3da-3cbc-407f-9cfa-de97879f3834" (UID: "90edf3da-3cbc-407f-9cfa-de97879f3834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.274504 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "90edf3da-3cbc-407f-9cfa-de97879f3834" (UID: "90edf3da-3cbc-407f-9cfa-de97879f3834"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315026 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315085 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bv5s\" (UniqueName: \"kubernetes.io/projected/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-kube-api-access-7bv5s\") pod \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315113 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91db9edf-7d6d-4189-aaac-480a438900be-logs\") pod \"91db9edf-7d6d-4189-aaac-480a438900be\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315140 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-httpd-run\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315172 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wps9m\" (UniqueName: \"kubernetes.io/projected/61d87ff7-95c3-43dc-afab-c9ba144b69e9-kube-api-access-wps9m\") pod \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315211 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqhrz\" (UniqueName: 
\"kubernetes.io/projected/91db9edf-7d6d-4189-aaac-480a438900be-kube-api-access-vqhrz\") pod \"91db9edf-7d6d-4189-aaac-480a438900be\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315248 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61d87ff7-95c3-43dc-afab-c9ba144b69e9-operator-scripts\") pod \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\" (UID: \"61d87ff7-95c3-43dc-afab-c9ba144b69e9\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315275 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-logs\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-combined-ca-bundle\") pod \"91db9edf-7d6d-4189-aaac-480a438900be\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315359 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-public-tls-certs\") pod \"91db9edf-7d6d-4189-aaac-480a438900be\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315393 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-public-tls-certs\") pod \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 
05:29:49.315449 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data-custom\") pod \"91db9edf-7d6d-4189-aaac-480a438900be\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315501 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-internal-tls-certs\") pod \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315539 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-combined-ca-bundle\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315733 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91db9edf-7d6d-4189-aaac-480a438900be-logs" (OuterVolumeSpecName: "logs") pod "91db9edf-7d6d-4189-aaac-480a438900be" (UID: "91db9edf-7d6d-4189-aaac-480a438900be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315829 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.315597 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-logs\") pod \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316142 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-scripts\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316169 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-config-data\") pod \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-internal-tls-certs\") pod \"91db9edf-7d6d-4189-aaac-480a438900be\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316279 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-config-data\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316306 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9ltz\" (UniqueName: 
\"kubernetes.io/projected/73fdf532-7bb7-43db-acbc-b949166ccd6b-kube-api-access-q9ltz\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316330 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-combined-ca-bundle\") pod \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\" (UID: \"ecdab3bb-c4de-4c49-9988-d9ed592a40a7\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316356 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data\") pod \"91db9edf-7d6d-4189-aaac-480a438900be\" (UID: \"91db9edf-7d6d-4189-aaac-480a438900be\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.316383 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-internal-tls-certs\") pod \"73fdf532-7bb7-43db-acbc-b949166ccd6b\" (UID: \"73fdf532-7bb7-43db-acbc-b949166ccd6b\") " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317064 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5vwx\" (UniqueName: \"kubernetes.io/projected/90edf3da-3cbc-407f-9cfa-de97879f3834-kube-api-access-d5vwx\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317083 4841 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317095 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317106 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91db9edf-7d6d-4189-aaac-480a438900be-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317119 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317130 4841 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/90edf3da-3cbc-407f-9cfa-de97879f3834-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317123 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61d87ff7-95c3-43dc-afab-c9ba144b69e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61d87ff7-95c3-43dc-afab-c9ba144b69e9" (UID: "61d87ff7-95c3-43dc-afab-c9ba144b69e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317141 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90edf3da-3cbc-407f-9cfa-de97879f3834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.317986 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-logs" (OuterVolumeSpecName: "logs") pod "ecdab3bb-c4de-4c49-9988-d9ed592a40a7" (UID: "ecdab3bb-c4de-4c49-9988-d9ed592a40a7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.318317 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d87ff7-95c3-43dc-afab-c9ba144b69e9-kube-api-access-wps9m" (OuterVolumeSpecName: "kube-api-access-wps9m") pod "61d87ff7-95c3-43dc-afab-c9ba144b69e9" (UID: "61d87ff7-95c3-43dc-afab-c9ba144b69e9"). InnerVolumeSpecName "kube-api-access-wps9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.318510 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-logs" (OuterVolumeSpecName: "logs") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.318781 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.321565 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91db9edf-7d6d-4189-aaac-480a438900be" (UID: "91db9edf-7d6d-4189-aaac-480a438900be"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.321656 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-kube-api-access-7bv5s" (OuterVolumeSpecName: "kube-api-access-7bv5s") pod "ecdab3bb-c4de-4c49-9988-d9ed592a40a7" (UID: "ecdab3bb-c4de-4c49-9988-d9ed592a40a7"). InnerVolumeSpecName "kube-api-access-7bv5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.321704 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91db9edf-7d6d-4189-aaac-480a438900be-kube-api-access-vqhrz" (OuterVolumeSpecName: "kube-api-access-vqhrz") pod "91db9edf-7d6d-4189-aaac-480a438900be" (UID: "91db9edf-7d6d-4189-aaac-480a438900be"). InnerVolumeSpecName "kube-api-access-vqhrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.323510 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-scripts" (OuterVolumeSpecName: "scripts") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.336469 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73fdf532-7bb7-43db-acbc-b949166ccd6b-kube-api-access-q9ltz" (OuterVolumeSpecName: "kube-api-access-q9ltz") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "kube-api-access-q9ltz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.341199 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91db9edf-7d6d-4189-aaac-480a438900be" (UID: "91db9edf-7d6d-4189-aaac-480a438900be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.356528 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-config-data" (OuterVolumeSpecName: "config-data") pod "ecdab3bb-c4de-4c49-9988-d9ed592a40a7" (UID: "ecdab3bb-c4de-4c49-9988-d9ed592a40a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.412050 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418002 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418027 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418035 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418044 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9ltz\" (UniqueName: \"kubernetes.io/projected/73fdf532-7bb7-43db-acbc-b949166ccd6b-kube-api-access-q9ltz\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418075 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418084 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bv5s\" (UniqueName: \"kubernetes.io/projected/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-kube-api-access-7bv5s\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418094 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wps9m\" (UniqueName: \"kubernetes.io/projected/61d87ff7-95c3-43dc-afab-c9ba144b69e9-kube-api-access-wps9m\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418103 4841 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqhrz\" (UniqueName: \"kubernetes.io/projected/91db9edf-7d6d-4189-aaac-480a438900be-kube-api-access-vqhrz\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418112 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61d87ff7-95c3-43dc-afab-c9ba144b69e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418119 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73fdf532-7bb7-43db-acbc-b949166ccd6b-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418127 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418136 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.418143 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.435467 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ecdab3bb-c4de-4c49-9988-d9ed592a40a7" (UID: "ecdab3bb-c4de-4c49-9988-d9ed592a40a7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.441916 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecdab3bb-c4de-4c49-9988-d9ed592a40a7" (UID: "ecdab3bb-c4de-4c49-9988-d9ed592a40a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.468451 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-config-data" (OuterVolumeSpecName: "config-data") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.479608 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data" (OuterVolumeSpecName: "config-data") pod "91db9edf-7d6d-4189-aaac-480a438900be" (UID: "91db9edf-7d6d-4189-aaac-480a438900be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.491022 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "91db9edf-7d6d-4189-aaac-480a438900be" (UID: "91db9edf-7d6d-4189-aaac-480a438900be"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.491047 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ecdab3bb-c4de-4c49-9988-d9ed592a40a7" (UID: "ecdab3bb-c4de-4c49-9988-d9ed592a40a7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.520326 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.520389 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.520438 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.520448 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.520457 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdab3bb-c4de-4c49-9988-d9ed592a40a7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.520466 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.539820 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.548522 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "73fdf532-7bb7-43db-acbc-b949166ccd6b" (UID: "73fdf532-7bb7-43db-acbc-b949166ccd6b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.555553 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "91db9edf-7d6d-4189-aaac-480a438900be" (UID: "91db9edf-7d6d-4189-aaac-480a438900be"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.567867 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05 is running failed: container process not found" containerID="b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.569749 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.569784 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05 is running failed: container process not found" containerID="b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.569830 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.571745 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.571763 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.571763 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05 is running failed: container process not found" containerID="b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.571848 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="ovn-northd" Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.575479 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" 
containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.575514 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server" Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.576298 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.576334 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.621670 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73fdf532-7bb7-43db-acbc-b949166ccd6b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.621701 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.621711 4841 reconciler_common.go:293] "Volume detached 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91db9edf-7d6d-4189-aaac-480a438900be-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.690197 4841 generic.go:334] "Generic (PLEG): container finished" podID="84508935-db34-4e2b-a3af-800ac432353b" containerID="d71848d69df407548c21d416cd1905e03bc4273aed56307ec048215b6bd60f64" exitCode=0 Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.690225 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerDied","Data":"d71848d69df407548c21d416cd1905e03bc4273aed56307ec048215b6bd60f64"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.690275 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84508935-db34-4e2b-a3af-800ac432353b","Type":"ContainerDied","Data":"fb4312d698aba3570b2814baa1b78d3ca797f5d3fabd9b6ab07c441e6ea68afc"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.690288 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb4312d698aba3570b2814baa1b78d3ca797f5d3fabd9b6ab07c441e6ea68afc" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.692072 4841 generic.go:334] "Generic (PLEG): container finished" podID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerID="5c4314bd5d4e8c9a32d1faa3556ab7cd3dae6013e3228dbb71483e6295221f1f" exitCode=0 Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.692117 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" event={"ID":"3b29e384-bd86-4102-8a9e-4745cd0ae8d5","Type":"ContainerDied","Data":"5c4314bd5d4e8c9a32d1faa3556ab7cd3dae6013e3228dbb71483e6295221f1f"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.692133 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" 
event={"ID":"3b29e384-bd86-4102-8a9e-4745cd0ae8d5","Type":"ContainerDied","Data":"a98cfd75ca6b38eb01720689777a32dfeb6ddb4b423204a48d38c8e079b35cd6"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.692142 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98cfd75ca6b38eb01720689777a32dfeb6ddb4b423204a48d38c8e079b35cd6" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.693950 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.695812 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xmpzp" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.695806 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xmpzp" event={"ID":"61d87ff7-95c3-43dc-afab-c9ba144b69e9","Type":"ContainerDied","Data":"b619be961f81759ded24e296b3d6adce2828cbc3371487faf7077205b3294127"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.697362 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"90edf3da-3cbc-407f-9cfa-de97879f3834","Type":"ContainerDied","Data":"972295e9c0c3b66164e7fbe32b6501eced61aa6288391ea197e6a4619df11e29"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.697418 4841 scope.go:117] "RemoveContainer" containerID="33a2da7865353e0526f0692b43cfed86797432813b0f51302a7eafdb6082c4e2" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.697512 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.710473 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.712689 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.713871 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 30 05:29:49 crc kubenswrapper[4841]: E0130 05:29:49.713900 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="60a29cc8-4615-40e7-a687-1852db124ba0" containerName="nova-cell1-conductor-conductor" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.714802 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"73fdf532-7bb7-43db-acbc-b949166ccd6b","Type":"ContainerDied","Data":"8c202ec5dddac615730615bf9ba4d68a18e872bb7cf1ae7d419c5d66943a6f6c"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.714884 
4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.723887 4841 generic.go:334] "Generic (PLEG): container finished" podID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerID="11b34a44cee08eb57f7dd12e3101d19de35b79f567abf07b3e86b4e5ced1412b" exitCode=0 Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.723963 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8e2461f9-732a-448f-a5d7-7528bc3956e3","Type":"ContainerDied","Data":"11b34a44cee08eb57f7dd12e3101d19de35b79f567abf07b3e86b4e5ced1412b"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.729921 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_84924340-1dd2-488e-a6e6-adfe62b61f2f/ovn-northd/0.log" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.729956 4841 generic.go:334] "Generic (PLEG): container finished" podID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerID="b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05" exitCode=139 Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.729998 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"84924340-1dd2-488e-a6e6-adfe62b61f2f","Type":"ContainerDied","Data":"b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.741950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d94d6f7cb-lf9nq" event={"ID":"91db9edf-7d6d-4189-aaac-480a438900be","Type":"ContainerDied","Data":"b4d32361f9052fe7c32aa9c1a7bae2e5fb1d7bfde165d57d450468b7d3637822"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.742154 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d94d6f7cb-lf9nq" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.758326 4841 generic.go:334] "Generic (PLEG): container finished" podID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerID="e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7" exitCode=0 Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.758373 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.758416 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecdab3bb-c4de-4c49-9988-d9ed592a40a7","Type":"ContainerDied","Data":"e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.759529 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecdab3bb-c4de-4c49-9988-d9ed592a40a7","Type":"ContainerDied","Data":"7191ea666925a4bd95337e8184f0dbfd12c2d8bb4996e5e1616237b7a804f5de"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.784026 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e68-account-create-update-ldwvb" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.784835 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.784883 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7be8df86-7b8d-4741-ae13-ec1b243549b3","Type":"ContainerDied","Data":"fa94895eb27278be690fd9feac29050bf3fa70985c449d4f4adec0b712a1b02e"} Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.784944 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.784948 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mtk2w" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="registry-server" containerID="cri-o://639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35" gracePeriod=2 Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.785946 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7z8bf" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.786714 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-88rdn" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" probeResult="failure" output="command timed out" Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.859986 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-88rdn" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" probeResult="failure" output=< Jan 30 05:29:49 crc kubenswrapper[4841]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Jan 30 05:29:49 crc kubenswrapper[4841]: > Jan 30 05:29:49 crc kubenswrapper[4841]: I0130 05:29:49.995536 4841 scope.go:117] "RemoveContainer" containerID="249a278941e10cc86f1a6bbc7212e043a936d0a3e53d7ee03f0ed2a158d8459b" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.042353 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.042836 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.044734 4841 scope.go:117] "RemoveContainer" containerID="f9bffcb1c98bf20cc71c95a2813cb2b9fa85cff321938e56af140ba1916af595" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.067375 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.085141 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_84924340-1dd2-488e-a6e6-adfe62b61f2f/ovn-northd/0.log" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.085208 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.085871 4841 scope.go:117] "RemoveContainer" containerID="bc897d8ef0f32c66c2606560ae71bcd74a208effcd7e7b2a87ba2f0b34843405" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.134684 4841 scope.go:117] "RemoveContainer" containerID="0de4c7d3f6fcb35d2a2e2a038bee89e95869ad06f6b00924683feb595867a396" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.137586 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.160300 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.174320 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.183064 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.189886 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 
05:29:50.196569 4841 scope.go:117] "RemoveContainer" containerID="e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.197449 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.206069 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d94d6f7cb-lf9nq"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.213017 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d94d6f7cb-lf9nq"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.219372 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.225358 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.234214 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mtk2w" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.236163 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwcns\" (UniqueName: \"kubernetes.io/projected/84508935-db34-4e2b-a3af-800ac432353b-kube-api-access-wwcns\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.236204 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7z8bf"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.236210 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data-custom\") pod \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.236375 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvrps\" (UniqueName: \"kubernetes.io/projected/8e2461f9-732a-448f-a5d7-7528bc3956e3-kube-api-access-jvrps\") pod \"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.236777 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-northd-tls-certs\") pod \"84924340-1dd2-488e-a6e6-adfe62b61f2f\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.236839 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-kolla-config\") pod 
\"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.236963 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-combined-ca-bundle\") pod \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.237066 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vlbb\" (UniqueName: \"kubernetes.io/projected/84924340-1dd2-488e-a6e6-adfe62b61f2f-kube-api-access-9vlbb\") pod \"84924340-1dd2-488e-a6e6-adfe62b61f2f\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.237091 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.237126 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-combined-ca-bundle\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.243767 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.245988 4841 scope.go:117] "RemoveContainer" containerID="153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246096 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-combined-ca-bundle\") pod \"84924340-1dd2-488e-a6e6-adfe62b61f2f\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246132 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-default\") pod \"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-log-httpd\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246358 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7z8bf"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246372 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-scripts\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246467 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-logs\") pod \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246494 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-config\") pod \"84924340-1dd2-488e-a6e6-adfe62b61f2f\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246522 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-run-httpd\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246572 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-rundir\") pod \"84924340-1dd2-488e-a6e6-adfe62b61f2f\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246594 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-operator-scripts\") pod \"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246617 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data\") pod \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246641 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-metrics-certs-tls-certs\") pod \"84924340-1dd2-488e-a6e6-adfe62b61f2f\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246657 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-sg-core-conf-yaml\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246679 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-ceilometer-tls-certs\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: \"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246710 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-galera-tls-certs\") pod \"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246728 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-scripts\") pod \"84924340-1dd2-488e-a6e6-adfe62b61f2f\" (UID: \"84924340-1dd2-488e-a6e6-adfe62b61f2f\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246744 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-config-data\") pod \"84508935-db34-4e2b-a3af-800ac432353b\" (UID: 
\"84508935-db34-4e2b-a3af-800ac432353b\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246765 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4tm6\" (UniqueName: \"kubernetes.io/projected/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-kube-api-access-n4tm6\") pod \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\" (UID: \"3b29e384-bd86-4102-8a9e-4745cd0ae8d5\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246788 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-generated\") pod \"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246804 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-combined-ca-bundle\") pod \"8e2461f9-732a-448f-a5d7-7528bc3956e3\" (UID: \"8e2461f9-732a-448f-a5d7-7528bc3956e3\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.246839 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.247335 4841 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.247347 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.247407 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.247442 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data podName:ad7779ad-0912-4695-853f-3ce786c2e9ae nodeName:}" failed. No retries permitted until 2026-01-30 05:29:58.247429845 +0000 UTC m=+1335.240902473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae") : configmap "rabbitmq-cell1-config-data" not found Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.247538 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.248628 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2461f9-732a-448f-a5d7-7528bc3956e3-kube-api-access-jvrps" (OuterVolumeSpecName: "kube-api-access-jvrps") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "kube-api-access-jvrps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.248658 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84508935-db34-4e2b-a3af-800ac432353b-kube-api-access-wwcns" (OuterVolumeSpecName: "kube-api-access-wwcns") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "kube-api-access-wwcns". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.249466 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-scripts" (OuterVolumeSpecName: "scripts") pod "84924340-1dd2-488e-a6e6-adfe62b61f2f" (UID: "84924340-1dd2-488e-a6e6-adfe62b61f2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.250185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.250780 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.251142 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-logs" (OuterVolumeSpecName: "logs") pod "3b29e384-bd86-4102-8a9e-4745cd0ae8d5" (UID: "3b29e384-bd86-4102-8a9e-4745cd0ae8d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.251193 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-config" (OuterVolumeSpecName: "config") pod "84924340-1dd2-488e-a6e6-adfe62b61f2f" (UID: "84924340-1dd2-488e-a6e6-adfe62b61f2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.251280 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.251541 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "84924340-1dd2-488e-a6e6-adfe62b61f2f" (UID: "84924340-1dd2-488e-a6e6-adfe62b61f2f"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.259107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3b29e384-bd86-4102-8a9e-4745cd0ae8d5" (UID: "3b29e384-bd86-4102-8a9e-4745cd0ae8d5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.263550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-scripts" (OuterVolumeSpecName: "scripts") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.271758 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84924340-1dd2-488e-a6e6-adfe62b61f2f-kube-api-access-9vlbb" (OuterVolumeSpecName: "kube-api-access-9vlbb") pod "84924340-1dd2-488e-a6e6-adfe62b61f2f" (UID: "84924340-1dd2-488e-a6e6-adfe62b61f2f"). InnerVolumeSpecName "kube-api-access-9vlbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.271877 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-kube-api-access-n4tm6" (OuterVolumeSpecName: "kube-api-access-n4tm6") pod "3b29e384-bd86-4102-8a9e-4745cd0ae8d5" (UID: "3b29e384-bd86-4102-8a9e-4745cd0ae8d5"). InnerVolumeSpecName "kube-api-access-n4tm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.281335 4841 scope.go:117] "RemoveContainer" containerID="e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.282871 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7\": container with ID starting with e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7 not found: ID does not exist" containerID="e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.282916 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7"} err="failed to get container status \"e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7\": rpc error: code = NotFound desc = could not find container \"e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7\": container with ID starting with e23013ab9ee97266a1e6583eb715d316426b22aeddf9bed9223c2988065c1ff7 not found: ID does not exist" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.282942 4841 scope.go:117] "RemoveContainer" containerID="153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.283984 
4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47\": container with ID starting with 153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47 not found: ID does not exist" containerID="153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.284048 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47"} err="failed to get container status \"153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47\": rpc error: code = NotFound desc = could not find container \"153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47\": container with ID starting with 153e13a739b90debddc5b89934f19037c13016122f4d9a0e3060510c7d3e9e47 not found: ID does not exist" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.284319 4841 scope.go:117] "RemoveContainer" containerID="d334449b0de1d005d626d94fd883ec0192a4936b40dc9ec24b425dde3a584637" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.292390 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.296614 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xmpzp"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.312482 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xmpzp"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.318961 4841 scope.go:117] "RemoveContainer" containerID="2603ac57002b34136c88ddf281a8b6cb56fddccd4b183b0ab1effc47d15e9154" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.320645 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b29e384-bd86-4102-8a9e-4745cd0ae8d5" (UID: "3b29e384-bd86-4102-8a9e-4745cd0ae8d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.349137 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-utilities\") pod \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.349200 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2xf8\" (UniqueName: \"kubernetes.io/projected/2b781c5e-301c-4458-8dea-4494e6ff8ee1-kube-api-access-l2xf8\") pod \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.349343 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-catalog-content\") pod 
\"2b781c5e-301c-4458-8dea-4494e6ff8ee1\" (UID: \"2b781c5e-301c-4458-8dea-4494e6ff8ee1\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.349937 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.349951 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.349940 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-utilities" (OuterVolumeSpecName: "utilities") pod "2b781c5e-301c-4458-8dea-4494e6ff8ee1" (UID: "2b781c5e-301c-4458-8dea-4494e6ff8ee1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.349959 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350001 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350021 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84508935-db34-4e2b-a3af-800ac432353b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350031 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350042 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2461f9-732a-448f-a5d7-7528bc3956e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350067 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84924340-1dd2-488e-a6e6-adfe62b61f2f-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350076 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4tm6\" (UniqueName: \"kubernetes.io/projected/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-kube-api-access-n4tm6\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350086 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8e2461f9-732a-448f-a5d7-7528bc3956e3-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350095 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwcns\" (UniqueName: \"kubernetes.io/projected/84508935-db34-4e2b-a3af-800ac432353b-kube-api-access-wwcns\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350108 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350117 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvrps\" (UniqueName: 
\"kubernetes.io/projected/8e2461f9-732a-448f-a5d7-7528bc3956e3-kube-api-access-jvrps\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350126 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350216 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vlbb\" (UniqueName: \"kubernetes.io/projected/84924340-1dd2-488e-a6e6-adfe62b61f2f-kube-api-access-9vlbb\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.350242 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.362007 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84924340-1dd2-488e-a6e6-adfe62b61f2f" (UID: "84924340-1dd2-488e-a6e6-adfe62b61f2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.366121 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.371385 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.375563 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b781c5e-301c-4458-8dea-4494e6ff8ee1-kube-api-access-l2xf8" (OuterVolumeSpecName: "kube-api-access-l2xf8") pod "2b781c5e-301c-4458-8dea-4494e6ff8ee1" (UID: "2b781c5e-301c-4458-8dea-4494e6ff8ee1"). InnerVolumeSpecName "kube-api-access-l2xf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.382531 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.385650 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2e68-account-create-update-ldwvb"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.388364 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2e68-account-create-update-ldwvb"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.395293 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.396967 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.405944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.411820 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.432201 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.436123 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.444036 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e72b40-f22a-4602-a483-7653890ae089" path="/var/lib/kubelet/pods/06e72b40-f22a-4602-a483-7653890ae089/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.444075 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.444122 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-scheduler-0" podUID="86140170-ca48-47e9-b587-43f98f3624c1" containerName="nova-scheduler-scheduler" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.444360 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1afed894-4dfb-4873-a45c-29b70507295a" path="/var/lib/kubelet/pods/1afed894-4dfb-4873-a45c-29b70507295a/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.445392 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28551500-d017-475a-aae4-8352782c0b4e" path="/var/lib/kubelet/pods/28551500-d017-475a-aae4-8352782c0b4e/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.446602 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d87ff7-95c3-43dc-afab-c9ba144b69e9" path="/var/lib/kubelet/pods/61d87ff7-95c3-43dc-afab-c9ba144b69e9/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.446601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "84924340-1dd2-488e-a6e6-adfe62b61f2f" (UID: "84924340-1dd2-488e-a6e6-adfe62b61f2f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.447699 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" path="/var/lib/kubelet/pods/73fdf532-7bb7-43db-acbc-b949166ccd6b/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.449234 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8e2461f9-732a-448f-a5d7-7528bc3956e3" (UID: "8e2461f9-732a-448f-a5d7-7528bc3956e3"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.449661 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" path="/var/lib/kubelet/pods/7be8df86-7b8d-4741-ae13-ec1b243549b3/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.450252 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90edf3da-3cbc-407f-9cfa-de97879f3834" path="/var/lib/kubelet/pods/90edf3da-3cbc-407f-9cfa-de97879f3834/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.451344 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91db9edf-7d6d-4189-aaac-480a438900be" path="/var/lib/kubelet/pods/91db9edf-7d6d-4189-aaac-480a438900be/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.451975 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9baa24e8-552c-425b-a494-ca70b9bcff0c" path="/var/lib/kubelet/pods/9baa24e8-552c-425b-a494-ca70b9bcff0c/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452057 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452084 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452095 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452105 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452114 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452122 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452131 4841 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e2461f9-732a-448f-a5d7-7528bc3956e3-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452140 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2xf8\" (UniqueName: \"kubernetes.io/projected/2b781c5e-301c-4458-8dea-4494e6ff8ee1-kube-api-access-l2xf8\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.452555 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" path="/var/lib/kubelet/pods/a9e49c58-8075-46a1-8bfd-44412a673589/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.453689 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b781c5e-301c-4458-8dea-4494e6ff8ee1" (UID: "2b781c5e-301c-4458-8dea-4494e6ff8ee1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.460738 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" path="/var/lib/kubelet/pods/ecdab3bb-c4de-4c49-9988-d9ed592a40a7/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.462102 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef067657-4804-406e-b45f-e19553dcd2d8" path="/var/lib/kubelet/pods/ef067657-4804-406e-b45f-e19553dcd2d8/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.462559 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1ce10f-fa3e-4526-a823-f4defdaf9085" path="/var/lib/kubelet/pods/ff1ce10f-fa3e-4526-a823-f4defdaf9085/volumes" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.471355 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data" (OuterVolumeSpecName: "config-data") pod "3b29e384-bd86-4102-8a9e-4745cd0ae8d5" (UID: "3b29e384-bd86-4102-8a9e-4745cd0ae8d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.474179 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.478990 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.486459 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-config-data" (OuterVolumeSpecName: "config-data") pod "84508935-db34-4e2b-a3af-800ac432353b" (UID: "84508935-db34-4e2b-a3af-800ac432353b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.487478 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "84924340-1dd2-488e-a6e6-adfe62b61f2f" (UID: "84924340-1dd2-488e-a6e6-adfe62b61f2f"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.553784 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b781c5e-301c-4458-8dea-4494e6ff8ee1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.553810 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmqn6\" (UniqueName: \"kubernetes.io/projected/06e72b40-f22a-4602-a483-7653890ae089-kube-api-access-qmqn6\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.553820 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.553830 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b29e384-bd86-4102-8a9e-4745cd0ae8d5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.553840 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.553848 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84508935-db34-4e2b-a3af-800ac432353b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.553855 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e72b40-f22a-4602-a483-7653890ae089-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 
05:29:50.553863 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/84924340-1dd2-488e-a6e6-adfe62b61f2f-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.624926 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cc423120-ba93-465b-8ef8-871904b901ef" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.655613 4841 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.655679 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data podName:cc423120-ba93-465b-8ef8-871904b901ef nodeName:}" failed. No retries permitted until 2026-01-30 05:29:58.655661625 +0000 UTC m=+1335.649134263 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data") pod "rabbitmq-server-0" (UID: "cc423120-ba93-465b-8ef8-871904b901ef") : configmap "rabbitmq-config-data" not found Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.754106 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.799443 4841 generic.go:334] "Generic (PLEG): container finished" podID="ad7779ad-0912-4695-853f-3ce786c2e9ae" containerID="7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97" exitCode=0 Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.799489 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad7779ad-0912-4695-853f-3ce786c2e9ae","Type":"ContainerDied","Data":"7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97"} Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.804535 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.804537 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad7779ad-0912-4695-853f-3ce786c2e9ae","Type":"ContainerDied","Data":"8600967c2ade3e891b207c101977cc2ccfbdba76d18ef1953587344914c55439"} Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.804573 4841 scope.go:117] "RemoveContainer" containerID="7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.812245 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_84924340-1dd2-488e-a6e6-adfe62b61f2f/ovn-northd/0.log" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.812330 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"84924340-1dd2-488e-a6e6-adfe62b61f2f","Type":"ContainerDied","Data":"5d2d793ab0fb7ca6219d5551b99e77832cc435ad9e83e1edcc674406dd3badf9"} Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.812449 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.815642 4841 generic.go:334] "Generic (PLEG): container finished" podID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerID="639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35" exitCode=0 Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.815714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtk2w" event={"ID":"2b781c5e-301c-4458-8dea-4494e6ff8ee1","Type":"ContainerDied","Data":"639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35"} Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.815747 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtk2w" event={"ID":"2b781c5e-301c-4458-8dea-4494e6ff8ee1","Type":"ContainerDied","Data":"09e2603ddea500778f0b4c77c640fd984a140cc5d12a09f0aca41206f54f915d"} Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.815780 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtk2w" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.820981 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.821122 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8e2461f9-732a-448f-a5d7-7528bc3956e3","Type":"ContainerDied","Data":"f380bc1b4527c3685cfad874202c3a7cdb8c53b0054c3b0f3a48d255f8ca2234"} Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.823864 4841 generic.go:334] "Generic (PLEG): container finished" podID="b191848f-8f16-40e6-8a2b-f66a0179f359" containerID="5f9a9849cbf3d28b7e6adfb866f7318a167f6a59eaf9156c8cc67f2d836ff57f" exitCode=0 Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.823912 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5445c58497-n245m" event={"ID":"b191848f-8f16-40e6-8a2b-f66a0179f359","Type":"ContainerDied","Data":"5f9a9849cbf3d28b7e6adfb866f7318a167f6a59eaf9156c8cc67f2d836ff57f"} Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.825042 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7b9cf755cd-5p4pk" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.825378 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.848234 4841 scope.go:117] "RemoveContainer" containerID="4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.858257 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-tls\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.858468 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.861998 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad7779ad-0912-4695-853f-3ce786c2e9ae-erlang-cookie-secret\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.862119 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-erlang-cookie\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.862253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-server-conf\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 
05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.862411 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg8w6\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-kube-api-access-vg8w6\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.862523 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-plugins\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.862630 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-plugins-conf\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.862728 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.862876 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad7779ad-0912-4695-853f-3ce786c2e9ae-pod-info\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.863028 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-confd\") pod \"ad7779ad-0912-4695-853f-3ce786c2e9ae\" (UID: \"ad7779ad-0912-4695-853f-3ce786c2e9ae\") " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.865545 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.869039 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.871054 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.873073 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.873291 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.873571 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.887456 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ad7779ad-0912-4695-853f-3ce786c2e9ae-pod-info" (OuterVolumeSpecName: "pod-info") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.887866 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad7779ad-0912-4695-853f-3ce786c2e9ae-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.891800 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-kube-api-access-vg8w6" (OuterVolumeSpecName: "kube-api-access-vg8w6") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "kube-api-access-vg8w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.898697 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.899545 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data" (OuterVolumeSpecName: "config-data") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.900259 4841 scope.go:117] "RemoveContainer" containerID="7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.904508 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97\": container with ID starting with 7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97 not found: ID does not exist" containerID="7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.904575 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97"} err="failed to get container status \"7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97\": rpc error: code = NotFound desc = could not find container \"7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97\": container with ID starting with 7325c7173b0b6a66d9a826424cf22a0971ee026656c7a65fae4ecf05ffb26b97 not found: ID does not exist" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.904598 4841 scope.go:117] "RemoveContainer" containerID="4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8" Jan 30 05:29:50 crc kubenswrapper[4841]: E0130 05:29:50.905048 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8\": container with ID starting with 4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8 not found: ID does not exist" containerID="4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.905069 
4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8"} err="failed to get container status \"4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8\": rpc error: code = NotFound desc = could not find container \"4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8\": container with ID starting with 4dfd3bf8acad84bdc8434c960a6f69f824e54adf209164c30f09c4f21207c7b8 not found: ID does not exist" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.905082 4841 scope.go:117] "RemoveContainer" containerID="465a65be37ff3c79b8c121932b87f14d4319db96cad76bedc4f91b6b55b13cf0" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.909093 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.914002 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.915350 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5445c58497-n245m" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.923031 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7b9cf755cd-5p4pk"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.923499 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7b9cf755cd-5p4pk"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.927954 4841 scope.go:117] "RemoveContainer" containerID="b18eea1e80216e394517133207cd5dd3e554316b05364a6776862ef3821a0f05" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.929385 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtk2w"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.937663 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mtk2w"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.942437 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.947475 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.948328 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-server-conf" (OuterVolumeSpecName: "server-conf") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.964309 4841 scope.go:117] "RemoveContainer" containerID="639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965363 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad7779ad-0912-4695-853f-3ce786c2e9ae-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965405 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965426 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965436 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965445 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad7779ad-0912-4695-853f-3ce786c2e9ae-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965453 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965461 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg8w6\" (UniqueName: 
\"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-kube-api-access-vg8w6\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965469 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965476 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.965484 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad7779ad-0912-4695-853f-3ce786c2e9ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.978646 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ad7779ad-0912-4695-853f-3ce786c2e9ae" (UID: "ad7779ad-0912-4695-853f-3ce786c2e9ae"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.981615 4841 scope.go:117] "RemoveContainer" containerID="47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c" Jan 30 05:29:50 crc kubenswrapper[4841]: I0130 05:29:50.983093 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.002274 4841 scope.go:117] "RemoveContainer" containerID="be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.018759 4841 scope.go:117] "RemoveContainer" containerID="639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35" Jan 30 05:29:51 crc kubenswrapper[4841]: E0130 05:29:51.019122 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35\": container with ID starting with 639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35 not found: ID does not exist" containerID="639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.019158 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35"} err="failed to get container status \"639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35\": rpc error: code = NotFound desc = could not find container \"639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35\": container with ID starting with 639c6882167e27c6097f6f149e1e6f0379c72c43092e5924fb57320aea0e3d35 not found: ID does not exist" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.019196 4841 scope.go:117] "RemoveContainer" 
containerID="47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c" Jan 30 05:29:51 crc kubenswrapper[4841]: E0130 05:29:51.019504 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c\": container with ID starting with 47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c not found: ID does not exist" containerID="47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.019535 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c"} err="failed to get container status \"47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c\": rpc error: code = NotFound desc = could not find container \"47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c\": container with ID starting with 47ddf991a16144e1858ac88abfce03ab53252ff609af74874acca0efd37dde5c not found: ID does not exist" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.019559 4841 scope.go:117] "RemoveContainer" containerID="be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36" Jan 30 05:29:51 crc kubenswrapper[4841]: E0130 05:29:51.019807 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36\": container with ID starting with be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36 not found: ID does not exist" containerID="be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.019848 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36"} err="failed to get container status \"be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36\": rpc error: code = NotFound desc = could not find container \"be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36\": container with ID starting with be2eaf76e1c546278544c503658d4e4057ccce574bfafaf72f4235735046ac36 not found: ID does not exist" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.019864 4841 scope.go:117] "RemoveContainer" containerID="11b34a44cee08eb57f7dd12e3101d19de35b79f567abf07b3e86b4e5ced1412b" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.037340 4841 scope.go:117] "RemoveContainer" containerID="876a3fddd107a1f4afa02010cf302a5b9e18105c753d854a957c74cfaba89c34" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066503 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-internal-tls-certs\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066578 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-fernet-keys\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066615 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-public-tls-certs\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066661 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-combined-ca-bundle\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066713 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-scripts\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066747 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-config-data\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066778 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-credential-keys\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.066818 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq567\" (UniqueName: \"kubernetes.io/projected/b191848f-8f16-40e6-8a2b-f66a0179f359-kube-api-access-nq567\") pod \"b191848f-8f16-40e6-8a2b-f66a0179f359\" (UID: \"b191848f-8f16-40e6-8a2b-f66a0179f359\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.067090 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad7779ad-0912-4695-853f-3ce786c2e9ae-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.067105 
4841 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.070029 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.070205 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b191848f-8f16-40e6-8a2b-f66a0179f359-kube-api-access-nq567" (OuterVolumeSpecName: "kube-api-access-nq567") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "kube-api-access-nq567". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.071525 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-scripts" (OuterVolumeSpecName: "scripts") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.072161 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.087736 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.090913 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-config-data" (OuterVolumeSpecName: "config-data") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.100674 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.102540 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b191848f-8f16-40e6-8a2b-f66a0179f359" (UID: "b191848f-8f16-40e6-8a2b-f66a0179f359"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.150549 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.158235 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.168935 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.169039 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.169093 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.169145 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.169207 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.169257 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 
05:29:51.169382 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq567\" (UniqueName: \"kubernetes.io/projected/b191848f-8f16-40e6-8a2b-f66a0179f359-kube-api-access-nq567\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.169450 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b191848f-8f16-40e6-8a2b-f66a0179f359-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:51 crc kubenswrapper[4841]: E0130 05:29:51.517085 4841 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 30 05:29:51 crc kubenswrapper[4841]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-01-30T05:29:44Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Jan 30 05:29:51 crc kubenswrapper[4841]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Jan 30 05:29:51 crc kubenswrapper[4841]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-88rdn" message=< Jan 30 05:29:51 crc kubenswrapper[4841]: Exiting ovn-controller (1) [FAILED] Jan 30 05:29:51 crc kubenswrapper[4841]: Killing ovn-controller (1) [ OK ] Jan 30 05:29:51 crc kubenswrapper[4841]: 2026-01-30T05:29:44Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Jan 30 05:29:51 crc kubenswrapper[4841]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Jan 30 05:29:51 crc kubenswrapper[4841]: > Jan 30 05:29:51 crc kubenswrapper[4841]: E0130 05:29:51.517141 4841 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 30 05:29:51 crc kubenswrapper[4841]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-01-30T05:29:44Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Jan 30 05:29:51 crc kubenswrapper[4841]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Jan 30 05:29:51 crc 
kubenswrapper[4841]: > pod="openstack/ovn-controller-88rdn" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" containerID="cri-o://6b3c23c498cb0fcf52b1b3ba92ec5b60803d90067523bd6519592f18536e1fc3" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.517234 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-88rdn" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" containerID="cri-o://6b3c23c498cb0fcf52b1b3ba92ec5b60803d90067523bd6519592f18536e1fc3" gracePeriod=23 Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.840210 4841 generic.go:334] "Generic (PLEG): container finished" podID="cc423120-ba93-465b-8ef8-871904b901ef" containerID="a19190f1ad60a61a31c984cadceaa8ab89c01149a3adc2e54a53efba5d740bd4" exitCode=0 Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.840306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc423120-ba93-465b-8ef8-871904b901ef","Type":"ContainerDied","Data":"a19190f1ad60a61a31c984cadceaa8ab89c01149a3adc2e54a53efba5d740bd4"} Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.853017 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5445c58497-n245m" event={"ID":"b191848f-8f16-40e6-8a2b-f66a0179f359","Type":"ContainerDied","Data":"c76dedfa2ca22effdff62e518b1c1253e08c42c734b788c4024dfa5241c692cc"} Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.853068 4841 scope.go:117] "RemoveContainer" containerID="5f9a9849cbf3d28b7e6adfb866f7318a167f6a59eaf9156c8cc67f2d836ff57f" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.853188 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5445c58497-n245m" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.867616 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-88rdn_47d25b55-9643-45fd-b2fe-eb593334924d/ovn-controller/0.log" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.867665 4841 generic.go:334] "Generic (PLEG): container finished" podID="47d25b55-9643-45fd-b2fe-eb593334924d" containerID="6b3c23c498cb0fcf52b1b3ba92ec5b60803d90067523bd6519592f18536e1fc3" exitCode=139 Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.867732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn" event={"ID":"47d25b55-9643-45fd-b2fe-eb593334924d","Type":"ContainerDied","Data":"6b3c23c498cb0fcf52b1b3ba92ec5b60803d90067523bd6519592f18536e1fc3"} Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.869795 4841 generic.go:334] "Generic (PLEG): container finished" podID="60a29cc8-4615-40e7-a687-1852db124ba0" containerID="3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec" exitCode=0 Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.869856 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60a29cc8-4615-40e7-a687-1852db124ba0","Type":"ContainerDied","Data":"3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec"} Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.874242 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-88rdn_47d25b55-9643-45fd-b2fe-eb593334924d/ovn-controller/0.log" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.874306 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-88rdn" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.932595 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5445c58497-n245m"] Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.950924 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5445c58497-n245m"] Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.990725 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-ovn-controller-tls-certs\") pod \"47d25b55-9643-45fd-b2fe-eb593334924d\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.990791 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-combined-ca-bundle\") pod \"47d25b55-9643-45fd-b2fe-eb593334924d\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.990815 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47d25b55-9643-45fd-b2fe-eb593334924d-scripts\") pod \"47d25b55-9643-45fd-b2fe-eb593334924d\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.990839 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9bs\" (UniqueName: \"kubernetes.io/projected/47d25b55-9643-45fd-b2fe-eb593334924d-kube-api-access-2c9bs\") pod \"47d25b55-9643-45fd-b2fe-eb593334924d\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.990897 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run-ovn\") pod \"47d25b55-9643-45fd-b2fe-eb593334924d\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.990964 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run\") pod \"47d25b55-9643-45fd-b2fe-eb593334924d\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.990988 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-log-ovn\") pod \"47d25b55-9643-45fd-b2fe-eb593334924d\" (UID: \"47d25b55-9643-45fd-b2fe-eb593334924d\") " Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.991290 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "47d25b55-9643-45fd-b2fe-eb593334924d" (UID: "47d25b55-9643-45fd-b2fe-eb593334924d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.994537 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "47d25b55-9643-45fd-b2fe-eb593334924d" (UID: "47d25b55-9643-45fd-b2fe-eb593334924d"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.994645 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run" (OuterVolumeSpecName: "var-run") pod "47d25b55-9643-45fd-b2fe-eb593334924d" (UID: "47d25b55-9643-45fd-b2fe-eb593334924d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:51 crc kubenswrapper[4841]: I0130 05:29:51.996322 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47d25b55-9643-45fd-b2fe-eb593334924d-scripts" (OuterVolumeSpecName: "scripts") pod "47d25b55-9643-45fd-b2fe-eb593334924d" (UID: "47d25b55-9643-45fd-b2fe-eb593334924d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.001841 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d25b55-9643-45fd-b2fe-eb593334924d-kube-api-access-2c9bs" (OuterVolumeSpecName: "kube-api-access-2c9bs") pod "47d25b55-9643-45fd-b2fe-eb593334924d" (UID: "47d25b55-9643-45fd-b2fe-eb593334924d"). InnerVolumeSpecName "kube-api-access-2c9bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.019188 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47d25b55-9643-45fd-b2fe-eb593334924d" (UID: "47d25b55-9643-45fd-b2fe-eb593334924d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.053736 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.059136 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "47d25b55-9643-45fd-b2fe-eb593334924d" (UID: "47d25b55-9643-45fd-b2fe-eb593334924d"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.095036 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.095068 4841 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.095079 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.095088 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d25b55-9643-45fd-b2fe-eb593334924d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.095096 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47d25b55-9643-45fd-b2fe-eb593334924d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.095104 4841 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2c9bs\" (UniqueName: \"kubernetes.io/projected/47d25b55-9643-45fd-b2fe-eb593334924d-kube-api-access-2c9bs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.095112 4841 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/47d25b55-9643-45fd-b2fe-eb593334924d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.120010 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.195808 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.195873 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpbqm\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-kube-api-access-cpbqm\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.195918 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-plugins-conf\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.195953 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 
30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.195987 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-erlang-cookie\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.196021 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-confd\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.196052 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-plugins\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.196094 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-tls\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.196127 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-server-conf\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.196178 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/cc423120-ba93-465b-8ef8-871904b901ef-pod-info\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.196213 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc423120-ba93-465b-8ef8-871904b901ef-erlang-cookie-secret\") pod \"cc423120-ba93-465b-8ef8-871904b901ef\" (UID: \"cc423120-ba93-465b-8ef8-871904b901ef\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.196828 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.197068 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.197231 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.199274 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc423120-ba93-465b-8ef8-871904b901ef-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.199961 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-kube-api-access-cpbqm" (OuterVolumeSpecName: "kube-api-access-cpbqm") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "kube-api-access-cpbqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.200780 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cc423120-ba93-465b-8ef8-871904b901ef-pod-info" (OuterVolumeSpecName: "pod-info") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.201951 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.205521 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.212152 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data" (OuterVolumeSpecName: "config-data") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.240950 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-server-conf" (OuterVolumeSpecName: "server-conf") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.286638 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cc423120-ba93-465b-8ef8-871904b901ef" (UID: "cc423120-ba93-465b-8ef8-871904b901ef"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297059 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-combined-ca-bundle\") pod \"60a29cc8-4615-40e7-a687-1852db124ba0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297159 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq46c\" (UniqueName: \"kubernetes.io/projected/60a29cc8-4615-40e7-a687-1852db124ba0-kube-api-access-bq46c\") pod \"60a29cc8-4615-40e7-a687-1852db124ba0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297266 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-config-data\") pod \"60a29cc8-4615-40e7-a687-1852db124ba0\" (UID: \"60a29cc8-4615-40e7-a687-1852db124ba0\") " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297531 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297548 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpbqm\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-kube-api-access-cpbqm\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297558 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297576 
4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297585 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297592 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297600 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297608 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc423120-ba93-465b-8ef8-871904b901ef-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297615 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc423120-ba93-465b-8ef8-871904b901ef-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297623 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc423120-ba93-465b-8ef8-871904b901ef-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.297631 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/cc423120-ba93-465b-8ef8-871904b901ef-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.302553 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a29cc8-4615-40e7-a687-1852db124ba0-kube-api-access-bq46c" (OuterVolumeSpecName: "kube-api-access-bq46c") pod "60a29cc8-4615-40e7-a687-1852db124ba0" (UID: "60a29cc8-4615-40e7-a687-1852db124ba0"). InnerVolumeSpecName "kube-api-access-bq46c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.310808 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.314763 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-config-data" (OuterVolumeSpecName: "config-data") pod "60a29cc8-4615-40e7-a687-1852db124ba0" (UID: "60a29cc8-4615-40e7-a687-1852db124ba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.319450 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60a29cc8-4615-40e7-a687-1852db124ba0" (UID: "60a29cc8-4615-40e7-a687-1852db124ba0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.398386 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq46c\" (UniqueName: \"kubernetes.io/projected/60a29cc8-4615-40e7-a687-1852db124ba0-kube-api-access-bq46c\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.398427 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.398439 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.398451 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a29cc8-4615-40e7-a687-1852db124ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.442937 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" path="/var/lib/kubelet/pods/2b781c5e-301c-4458-8dea-4494e6ff8ee1/volumes" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.444895 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" path="/var/lib/kubelet/pods/3b29e384-bd86-4102-8a9e-4745cd0ae8d5/volumes" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.446209 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84508935-db34-4e2b-a3af-800ac432353b" path="/var/lib/kubelet/pods/84508935-db34-4e2b-a3af-800ac432353b/volumes" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.448901 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" path="/var/lib/kubelet/pods/84924340-1dd2-488e-a6e6-adfe62b61f2f/volumes" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.462018 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2461f9-732a-448f-a5d7-7528bc3956e3" path="/var/lib/kubelet/pods/8e2461f9-732a-448f-a5d7-7528bc3956e3/volumes" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.463742 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7779ad-0912-4695-853f-3ce786c2e9ae" path="/var/lib/kubelet/pods/ad7779ad-0912-4695-853f-3ce786c2e9ae/volumes" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.465774 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b191848f-8f16-40e6-8a2b-f66a0179f359" path="/var/lib/kubelet/pods/b191848f-8f16-40e6-8a2b-f66a0179f359/volumes" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.890312 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-88rdn_47d25b55-9643-45fd-b2fe-eb593334924d/ovn-controller/0.log" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.890572 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-88rdn" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.891479 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-88rdn" event={"ID":"47d25b55-9643-45fd-b2fe-eb593334924d","Type":"ContainerDied","Data":"298c3e61574db7def40be6b74e7df4b9777b7f2eb9efaf52c249f4dc85769ee6"} Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.891523 4841 scope.go:117] "RemoveContainer" containerID="6b3c23c498cb0fcf52b1b3ba92ec5b60803d90067523bd6519592f18536e1fc3" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.895889 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.895906 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60a29cc8-4615-40e7-a687-1852db124ba0","Type":"ContainerDied","Data":"3387db4692895bcb2dcd2388f34c0c36789dd3cf261768a3333ed24ddc7e2a34"} Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.900804 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc423120-ba93-465b-8ef8-871904b901ef","Type":"ContainerDied","Data":"4aad50bd4b29190901f79507be62ff07001fd77ed292118463a34ecb6a2de1f7"} Jan 30 05:29:52 crc kubenswrapper[4841]: I0130 05:29:52.900856 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.122844 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.127137 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.142311 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.143290 4841 scope.go:117] "RemoveContainer" containerID="3043b1132b1d8cd3d83be2ceb2adfb4bc8bd0b66f8fa3fcc542bda3812e357ec" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.147611 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.153556 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-88rdn"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.165971 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-88rdn"] Jan 30 05:29:53 crc 
kubenswrapper[4841]: I0130 05:29:53.198721 4841 scope.go:117] "RemoveContainer" containerID="a19190f1ad60a61a31c984cadceaa8ab89c01149a3adc2e54a53efba5d740bd4" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.221543 4841 scope.go:117] "RemoveContainer" containerID="7f1ddbf696f7e72c83c4c0ee8f83a7e88cf635df1e2c2faa46855be05f07c1a9" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.293698 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.339457 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.425141 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xczf\" (UniqueName: \"kubernetes.io/projected/a035e8ef-e433-4c59-a0fd-09937eb5f226-kube-api-access-6xczf\") pod \"a035e8ef-e433-4c59-a0fd-09937eb5f226\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.425217 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-combined-ca-bundle\") pod \"a035e8ef-e433-4c59-a0fd-09937eb5f226\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.425274 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data-custom\") pod \"a035e8ef-e433-4c59-a0fd-09937eb5f226\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.425292 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data\") pod \"a035e8ef-e433-4c59-a0fd-09937eb5f226\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.425308 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a035e8ef-e433-4c59-a0fd-09937eb5f226-logs\") pod \"a035e8ef-e433-4c59-a0fd-09937eb5f226\" (UID: \"a035e8ef-e433-4c59-a0fd-09937eb5f226\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.426002 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a035e8ef-e433-4c59-a0fd-09937eb5f226-logs" (OuterVolumeSpecName: "logs") pod "a035e8ef-e433-4c59-a0fd-09937eb5f226" (UID: "a035e8ef-e433-4c59-a0fd-09937eb5f226"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.435544 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a035e8ef-e433-4c59-a0fd-09937eb5f226" (UID: "a035e8ef-e433-4c59-a0fd-09937eb5f226"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.435531 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a035e8ef-e433-4c59-a0fd-09937eb5f226-kube-api-access-6xczf" (OuterVolumeSpecName: "kube-api-access-6xczf") pod "a035e8ef-e433-4c59-a0fd-09937eb5f226" (UID: "a035e8ef-e433-4c59-a0fd-09937eb5f226"). InnerVolumeSpecName "kube-api-access-6xczf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.444925 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a035e8ef-e433-4c59-a0fd-09937eb5f226" (UID: "a035e8ef-e433-4c59-a0fd-09937eb5f226"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.474837 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data" (OuterVolumeSpecName: "config-data") pod "a035e8ef-e433-4c59-a0fd-09937eb5f226" (UID: "a035e8ef-e433-4c59-a0fd-09937eb5f226"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.511966 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.532773 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data\") pod \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.532931 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-etc-machine-id\") pod \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.533093 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-scripts\") pod \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.533016 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" (UID: "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.533178 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-combined-ca-bundle\") pod \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.533683 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw55s\" (UniqueName: \"kubernetes.io/projected/86140170-ca48-47e9-b587-43f98f3624c1-kube-api-access-nw55s\") pod \"86140170-ca48-47e9-b587-43f98f3624c1\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.533716 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwkbg\" (UniqueName: \"kubernetes.io/projected/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-kube-api-access-nwkbg\") pod \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.533823 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data-custom\") pod \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\" (UID: \"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.534341 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xczf\" (UniqueName: \"kubernetes.io/projected/a035e8ef-e433-4c59-a0fd-09937eb5f226-kube-api-access-6xczf\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.534359 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.534369 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.534377 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.534469 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a035e8ef-e433-4c59-a0fd-09937eb5f226-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.534504 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a035e8ef-e433-4c59-a0fd-09937eb5f226-logs\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.536742 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-scripts" (OuterVolumeSpecName: "scripts") pod "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" (UID: "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.537508 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86140170-ca48-47e9-b587-43f98f3624c1-kube-api-access-nw55s" (OuterVolumeSpecName: "kube-api-access-nw55s") pod "86140170-ca48-47e9-b587-43f98f3624c1" (UID: "86140170-ca48-47e9-b587-43f98f3624c1"). InnerVolumeSpecName "kube-api-access-nw55s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.538588 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" (UID: "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.547533 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-kube-api-access-nwkbg" (OuterVolumeSpecName: "kube-api-access-nwkbg") pod "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" (UID: "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc"). InnerVolumeSpecName "kube-api-access-nwkbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.591527 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" (UID: "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.636242 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-config-data\") pod \"86140170-ca48-47e9-b587-43f98f3624c1\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.636280 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-combined-ca-bundle\") pod \"86140170-ca48-47e9-b587-43f98f3624c1\" (UID: \"86140170-ca48-47e9-b587-43f98f3624c1\") " Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.636465 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.636480 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.636491 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw55s\" (UniqueName: \"kubernetes.io/projected/86140170-ca48-47e9-b587-43f98f3624c1-kube-api-access-nw55s\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.636501 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwkbg\" (UniqueName: \"kubernetes.io/projected/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-kube-api-access-nwkbg\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.636510 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.637735 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data" (OuterVolumeSpecName: "config-data") pod "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" (UID: "d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.663620 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86140170-ca48-47e9-b587-43f98f3624c1" (UID: "86140170-ca48-47e9-b587-43f98f3624c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.665704 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-config-data" (OuterVolumeSpecName: "config-data") pod "86140170-ca48-47e9-b587-43f98f3624c1" (UID: "86140170-ca48-47e9-b587-43f98f3624c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.737590 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.737621 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.737633 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86140170-ca48-47e9-b587-43f98f3624c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.920776 4841 generic.go:334] "Generic (PLEG): container finished" podID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerID="008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25" exitCode=0 Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.920847 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc","Type":"ContainerDied","Data":"008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25"} Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.920875 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc","Type":"ContainerDied","Data":"3ef8c0e368d68aa24591968b393ecd2a6f1a6f912debf6a12b2028a83d4c98e8"} Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.920876 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.920893 4841 scope.go:117] "RemoveContainer" containerID="0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.927999 4841 generic.go:334] "Generic (PLEG): container finished" podID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerID="ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d" exitCode=0 Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.928056 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" event={"ID":"a035e8ef-e433-4c59-a0fd-09937eb5f226","Type":"ContainerDied","Data":"ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d"} Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.928078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" event={"ID":"a035e8ef-e433-4c59-a0fd-09937eb5f226","Type":"ContainerDied","Data":"d7b374183505e48eddd61c21c0009dd11a4291e4f2b2d78965b774068964e6c4"} Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.928132 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69fd44f6fc-dpzfd" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.934808 4841 generic.go:334] "Generic (PLEG): container finished" podID="86140170-ca48-47e9-b587-43f98f3624c1" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" exitCode=0 Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.934881 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86140170-ca48-47e9-b587-43f98f3624c1","Type":"ContainerDied","Data":"478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2"} Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.934896 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.934921 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86140170-ca48-47e9-b587-43f98f3624c1","Type":"ContainerDied","Data":"ca260dccf3864aeff6c9a3062998d02be6a111667474abb112a582d6fba772e4"} Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.958481 4841 scope.go:117] "RemoveContainer" containerID="008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.973239 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-69fd44f6fc-dpzfd"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.978571 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-69fd44f6fc-dpzfd"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.988560 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.990306 4841 scope.go:117] "RemoveContainer" containerID="0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87" Jan 30 05:29:53 crc kubenswrapper[4841]: E0130 05:29:53.990872 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87\": container with ID starting with 0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87 not found: ID does not exist" containerID="0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.990936 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87"} err="failed to get container status 
\"0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87\": rpc error: code = NotFound desc = could not find container \"0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87\": container with ID starting with 0cadeb367f1e4dadf2e732cab98c3e040925a70be70907b8fdccf83a8428ff87 not found: ID does not exist" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.990990 4841 scope.go:117] "RemoveContainer" containerID="008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25" Jan 30 05:29:53 crc kubenswrapper[4841]: E0130 05:29:53.991518 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25\": container with ID starting with 008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25 not found: ID does not exist" containerID="008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.991571 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25"} err="failed to get container status \"008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25\": rpc error: code = NotFound desc = could not find container \"008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25\": container with ID starting with 008ada55ae0a3d1328a102907e6cc3b70508402afef78ef0612f96e388102b25 not found: ID does not exist" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.991612 4841 scope.go:117] "RemoveContainer" containerID="ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d" Jan 30 05:29:53 crc kubenswrapper[4841]: I0130 05:29:53.994323 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.003381 4841 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.015649 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.025716 4841 scope.go:117] "RemoveContainer" containerID="da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.044584 4841 scope.go:117] "RemoveContainer" containerID="ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d" Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.045114 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d\": container with ID starting with ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d not found: ID does not exist" containerID="ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.045166 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d"} err="failed to get container status \"ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d\": rpc error: code = NotFound desc = could not find container \"ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d\": container with ID starting with ffe926e305a205c6b1e2501be79995c85849f689f3be491055e2e20992836d1d not found: ID does not exist" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.045206 4841 scope.go:117] "RemoveContainer" containerID="da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184" Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.045618 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184\": container with ID starting with da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184 not found: ID does not exist" containerID="da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.045655 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184"} err="failed to get container status \"da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184\": rpc error: code = NotFound desc = could not find container \"da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184\": container with ID starting with da2c73353a9e9de855384326c800ddf55cc65bd87e08abba1d165e11d98d0184 not found: ID does not exist" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.045680 4841 scope.go:117] "RemoveContainer" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.067190 4841 scope.go:117] "RemoveContainer" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.067611 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2\": container with ID starting with 478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2 not found: ID does not exist" containerID="478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.067654 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2"} err="failed to get container status 
\"478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2\": rpc error: code = NotFound desc = could not find container \"478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2\": container with ID starting with 478616da8e7b4f7bb050f2daec156b5c290258c38d38b01656f94a0ca46f27b2 not found: ID does not exist" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.444002 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" path="/var/lib/kubelet/pods/47d25b55-9643-45fd-b2fe-eb593334924d/volumes" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.445213 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a29cc8-4615-40e7-a687-1852db124ba0" path="/var/lib/kubelet/pods/60a29cc8-4615-40e7-a687-1852db124ba0/volumes" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.446239 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86140170-ca48-47e9-b587-43f98f3624c1" path="/var/lib/kubelet/pods/86140170-ca48-47e9-b587-43f98f3624c1/volumes" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.449190 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" path="/var/lib/kubelet/pods/a035e8ef-e433-4c59-a0fd-09937eb5f226/volumes" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.450652 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc423120-ba93-465b-8ef8-871904b901ef" path="/var/lib/kubelet/pods/cc423120-ba93-465b-8ef8-871904b901ef/volumes" Jan 30 05:29:54 crc kubenswrapper[4841]: I0130 05:29:54.451767 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" path="/var/lib/kubelet/pods/d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc/volumes" Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.570189 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.570720 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.571670 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.571727 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server" Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.572293 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.574270 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.576380 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:54 crc kubenswrapper[4841]: E0130 05:29:54.576478 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd" Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.569471 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.570574 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.574514 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.574570 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server" Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.575429 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.578651 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.580032 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:29:59 crc kubenswrapper[4841]: E0130 05:29:59.580076 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147086 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz"] Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147518 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="extract-utilities" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147545 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="extract-utilities" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147567 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerName="galera" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147579 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerName="galera" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147596 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" 
containerName="nova-metadata-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147608 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147625 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="proxy-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147637 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="proxy-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147654 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="openstack-network-exporter" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147665 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="openstack-network-exporter" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147689 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9baa24e8-552c-425b-a494-ca70b9bcff0c" containerName="nova-cell0-conductor-conductor" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147702 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9baa24e8-552c-425b-a494-ca70b9bcff0c" containerName="nova-cell0-conductor-conductor" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147723 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="extract-content" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147735 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="extract-content" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147757 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cc423120-ba93-465b-8ef8-871904b901ef" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147769 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc423120-ba93-465b-8ef8-871904b901ef" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147784 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147796 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147819 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147833 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147851 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc423120-ba93-465b-8ef8-871904b901ef" containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147863 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc423120-ba93-465b-8ef8-871904b901ef" containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147884 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147895 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147915 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad7779ad-0912-4695-853f-3ce786c2e9ae" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147926 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7779ad-0912-4695-853f-3ce786c2e9ae" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147945 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147958 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.147978 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.147990 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148009 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148021 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148043 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28551500-d017-475a-aae4-8352782c0b4e" containerName="placement-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148055 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="28551500-d017-475a-aae4-8352782c0b4e" containerName="placement-api" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148074 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148086 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148098 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148110 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148131 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148143 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148165 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="ovn-northd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148177 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="ovn-northd" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148194 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerName="mysql-bootstrap" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148205 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerName="mysql-bootstrap" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148227 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28551500-d017-475a-aae4-8352782c0b4e" 
containerName="placement-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148239 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="28551500-d017-475a-aae4-8352782c0b4e" containerName="placement-log" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148257 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="probe" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148268 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="probe" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148285 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-central-agent" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148298 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-central-agent" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148313 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191848f-8f16-40e6-8a2b-f66a0179f359" containerName="keystone-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148324 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191848f-8f16-40e6-8a2b-f66a0179f359" containerName="keystone-api" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148344 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="registry-server" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148356 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="registry-server" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148372 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7779ad-0912-4695-853f-3ce786c2e9ae" 
containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148383 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7779ad-0912-4695-853f-3ce786c2e9ae" containerName="setup-container" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148422 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148434 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148454 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148466 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148485 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90edf3da-3cbc-407f-9cfa-de97879f3834" containerName="memcached" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148496 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="90edf3da-3cbc-407f-9cfa-de97879f3834" containerName="memcached" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148510 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148521 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-api" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148537 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" Jan 30 
05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148548 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148572 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-metadata" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148584 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-metadata" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148604 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="cinder-scheduler" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148617 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="cinder-scheduler" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148631 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a29cc8-4615-40e7-a687-1852db124ba0" containerName="nova-cell1-conductor-conductor" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148646 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a29cc8-4615-40e7-a687-1852db124ba0" containerName="nova-cell1-conductor-conductor" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148668 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-notification-agent" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148679 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-notification-agent" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148695 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148707 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148729 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86140170-ca48-47e9-b587-43f98f3624c1" containerName="nova-scheduler-scheduler" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148741 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="86140170-ca48-47e9-b587-43f98f3624c1" containerName="nova-scheduler-scheduler" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148755 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="sg-core" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148766 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="sg-core" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148785 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef067657-4804-406e-b45f-e19553dcd2d8" containerName="kube-state-metrics" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148797 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef067657-4804-406e-b45f-e19553dcd2d8" containerName="kube-state-metrics" Jan 30 05:30:00 crc kubenswrapper[4841]: E0130 05:30:00.148812 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.148823 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149062 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149082 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="openstack-network-exporter" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149096 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="86140170-ca48-47e9-b587-43f98f3624c1" containerName="nova-scheduler-scheduler" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149118 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2461f9-732a-448f-a5d7-7528bc3956e3" containerName="galera" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149140 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149161 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc423120-ba93-465b-8ef8-871904b901ef" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149180 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-notification-agent" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149198 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-metadata" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149213 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b781c5e-301c-4458-8dea-4494e6ff8ee1" containerName="registry-server" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149226 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="28551500-d017-475a-aae4-8352782c0b4e" containerName="placement-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149245 4841 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149266 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="90edf3da-3cbc-407f-9cfa-de97879f3834" containerName="memcached" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149282 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="ceilometer-central-agent" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149299 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be8df86-7b8d-4741-ae13-ec1b243549b3" containerName="nova-metadata-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149315 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a29cc8-4615-40e7-a687-1852db124ba0" containerName="nova-cell1-conductor-conductor" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149337 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="28551500-d017-475a-aae4-8352782c0b4e" containerName="placement-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149372 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef067657-4804-406e-b45f-e19553dcd2d8" containerName="kube-state-metrics" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149392 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a035e8ef-e433-4c59-a0fd-09937eb5f226" containerName="barbican-worker-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149437 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149453 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9baa24e8-552c-425b-a494-ca70b9bcff0c" 
containerName="nova-cell0-conductor-conductor" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149493 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b29e384-bd86-4102-8a9e-4745cd0ae8d5" containerName="barbican-keystone-listener-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149511 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149525 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149545 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e49c58-8075-46a1-8bfd-44412a673589" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149557 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="91db9edf-7d6d-4189-aaac-480a438900be" containerName="barbican-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149569 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b191848f-8f16-40e6-8a2b-f66a0179f359" containerName="keystone-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149586 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="84924340-1dd2-488e-a6e6-adfe62b61f2f" containerName="ovn-northd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149609 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149624 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7779ad-0912-4695-853f-3ce786c2e9ae" containerName="rabbitmq" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149641 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="probe" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149660 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="proxy-httpd" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149676 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdab3bb-c4de-4c49-9988-d9ed592a40a7" containerName="nova-api-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149695 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149710 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="84508935-db34-4e2b-a3af-800ac432353b" containerName="sg-core" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149724 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d25b55-9643-45fd-b2fe-eb593334924d" containerName="ovn-controller" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149739 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a2724da-6b9b-4947-a4e3-894938742304" containerName="cinder-api-log" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.149751 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d656fb8e-ddd3-4aab-8b1b-10ed00ed54fc" containerName="cinder-scheduler" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.150490 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.155708 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz"] Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.155759 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.158000 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.362454 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3827ca89-b447-4c79-a946-bb1170c1e039-config-volume\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.362554 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2nt\" (UniqueName: \"kubernetes.io/projected/3827ca89-b447-4c79-a946-bb1170c1e039-kube-api-access-zm2nt\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.362659 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3827ca89-b447-4c79-a946-bb1170c1e039-secret-volume\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.465057 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3827ca89-b447-4c79-a946-bb1170c1e039-config-volume\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.465189 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2nt\" (UniqueName: \"kubernetes.io/projected/3827ca89-b447-4c79-a946-bb1170c1e039-kube-api-access-zm2nt\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.465445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3827ca89-b447-4c79-a946-bb1170c1e039-secret-volume\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.466513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3827ca89-b447-4c79-a946-bb1170c1e039-config-volume\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.486676 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3827ca89-b447-4c79-a946-bb1170c1e039-secret-volume\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.491310 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2nt\" (UniqueName: \"kubernetes.io/projected/3827ca89-b447-4c79-a946-bb1170c1e039-kube-api-access-zm2nt\") pod \"collect-profiles-29495850-v5jgz\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:00 crc kubenswrapper[4841]: I0130 05:30:00.783217 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:01 crc kubenswrapper[4841]: I0130 05:30:01.066213 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz"] Jan 30 05:30:01 crc kubenswrapper[4841]: W0130 05:30:01.072945 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3827ca89_b447_4c79_a946_bb1170c1e039.slice/crio-3da918589706b4d29a096cbc8710586f6268d008bc4303e1bfdb96affb953b98 WatchSource:0}: Error finding container 3da918589706b4d29a096cbc8710586f6268d008bc4303e1bfdb96affb953b98: Status 404 returned error can't find the container with id 3da918589706b4d29a096cbc8710586f6268d008bc4303e1bfdb96affb953b98 Jan 30 05:30:02 crc kubenswrapper[4841]: I0130 05:30:02.023762 4841 generic.go:334] "Generic (PLEG): container finished" podID="3827ca89-b447-4c79-a946-bb1170c1e039" containerID="435aae0b9f4a9db91cf6f64a68c68cc9cf71c6c3a4aa3c5817295a3f6d932f9d" exitCode=0 Jan 30 05:30:02 crc kubenswrapper[4841]: I0130 05:30:02.024152 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" event={"ID":"3827ca89-b447-4c79-a946-bb1170c1e039","Type":"ContainerDied","Data":"435aae0b9f4a9db91cf6f64a68c68cc9cf71c6c3a4aa3c5817295a3f6d932f9d"} Jan 30 05:30:02 crc kubenswrapper[4841]: I0130 05:30:02.024241 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" event={"ID":"3827ca89-b447-4c79-a946-bb1170c1e039","Type":"ContainerStarted","Data":"3da918589706b4d29a096cbc8710586f6268d008bc4303e1bfdb96affb953b98"} Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.035000 4841 generic.go:334] "Generic (PLEG): container finished" podID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerID="507b9c74c5025e94796600d70c3425d021076e350f65b3c2da7a6f02528353c9" exitCode=0 Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.035356 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65977b5879-qctf6" event={"ID":"8ad9e30b-abf9-45fd-9088-103c94e4ed70","Type":"ContainerDied","Data":"507b9c74c5025e94796600d70c3425d021076e350f65b3c2da7a6f02528353c9"} Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.229995 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-65977b5879-qctf6" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.408879 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-combined-ca-bundle\") pod \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.409179 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-public-tls-certs\") pod \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.409213 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-config\") pod \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.409276 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc5nf\" (UniqueName: \"kubernetes.io/projected/8ad9e30b-abf9-45fd-9088-103c94e4ed70-kube-api-access-cc5nf\") pod \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.409313 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-internal-tls-certs\") pod \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.409347 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-ovndb-tls-certs\") pod \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.409448 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-httpd-config\") pod \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\" (UID: \"8ad9e30b-abf9-45fd-9088-103c94e4ed70\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.417094 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad9e30b-abf9-45fd-9088-103c94e4ed70-kube-api-access-cc5nf" (OuterVolumeSpecName: "kube-api-access-cc5nf") pod "8ad9e30b-abf9-45fd-9088-103c94e4ed70" (UID: "8ad9e30b-abf9-45fd-9088-103c94e4ed70"). InnerVolumeSpecName "kube-api-access-cc5nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.430630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8ad9e30b-abf9-45fd-9088-103c94e4ed70" (UID: "8ad9e30b-abf9-45fd-9088-103c94e4ed70"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.436895 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.511516 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.511578 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc5nf\" (UniqueName: \"kubernetes.io/projected/8ad9e30b-abf9-45fd-9088-103c94e4ed70-kube-api-access-cc5nf\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.531550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-config" (OuterVolumeSpecName: "config") pod "8ad9e30b-abf9-45fd-9088-103c94e4ed70" (UID: "8ad9e30b-abf9-45fd-9088-103c94e4ed70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.540681 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ad9e30b-abf9-45fd-9088-103c94e4ed70" (UID: "8ad9e30b-abf9-45fd-9088-103c94e4ed70"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.558569 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8ad9e30b-abf9-45fd-9088-103c94e4ed70" (UID: "8ad9e30b-abf9-45fd-9088-103c94e4ed70"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.559197 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ad9e30b-abf9-45fd-9088-103c94e4ed70" (UID: "8ad9e30b-abf9-45fd-9088-103c94e4ed70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.565330 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ad9e30b-abf9-45fd-9088-103c94e4ed70" (UID: "8ad9e30b-abf9-45fd-9088-103c94e4ed70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.613173 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2nt\" (UniqueName: \"kubernetes.io/projected/3827ca89-b447-4c79-a946-bb1170c1e039-kube-api-access-zm2nt\") pod \"3827ca89-b447-4c79-a946-bb1170c1e039\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.613218 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3827ca89-b447-4c79-a946-bb1170c1e039-config-volume\") pod \"3827ca89-b447-4c79-a946-bb1170c1e039\" (UID: \"3827ca89-b447-4c79-a946-bb1170c1e039\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.613293 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3827ca89-b447-4c79-a946-bb1170c1e039-secret-volume\") pod \"3827ca89-b447-4c79-a946-bb1170c1e039\" (UID: 
\"3827ca89-b447-4c79-a946-bb1170c1e039\") " Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.613952 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3827ca89-b447-4c79-a946-bb1170c1e039-config-volume" (OuterVolumeSpecName: "config-volume") pod "3827ca89-b447-4c79-a946-bb1170c1e039" (UID: "3827ca89-b447-4c79-a946-bb1170c1e039"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.614095 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.614115 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.614126 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-config\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.614136 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.614144 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3827ca89-b447-4c79-a946-bb1170c1e039-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.614152 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ad9e30b-abf9-45fd-9088-103c94e4ed70-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.616516 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3827ca89-b447-4c79-a946-bb1170c1e039-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3827ca89-b447-4c79-a946-bb1170c1e039" (UID: "3827ca89-b447-4c79-a946-bb1170c1e039"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.617071 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3827ca89-b447-4c79-a946-bb1170c1e039-kube-api-access-zm2nt" (OuterVolumeSpecName: "kube-api-access-zm2nt") pod "3827ca89-b447-4c79-a946-bb1170c1e039" (UID: "3827ca89-b447-4c79-a946-bb1170c1e039"). InnerVolumeSpecName "kube-api-access-zm2nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.715426 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm2nt\" (UniqueName: \"kubernetes.io/projected/3827ca89-b447-4c79-a946-bb1170c1e039-kube-api-access-zm2nt\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:03 crc kubenswrapper[4841]: I0130 05:30:03.715464 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3827ca89-b447-4c79-a946-bb1170c1e039-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.048464 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" event={"ID":"3827ca89-b447-4c79-a946-bb1170c1e039","Type":"ContainerDied","Data":"3da918589706b4d29a096cbc8710586f6268d008bc4303e1bfdb96affb953b98"} Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.048506 4841 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz" Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.048523 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3da918589706b4d29a096cbc8710586f6268d008bc4303e1bfdb96affb953b98" Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.050964 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65977b5879-qctf6" event={"ID":"8ad9e30b-abf9-45fd-9088-103c94e4ed70","Type":"ContainerDied","Data":"707af2a63bdd05af37293d4e7a98f5b96492243fd1c69d8b55b8fce62b1f6704"} Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.051028 4841 scope.go:117] "RemoveContainer" containerID="71a6a5266aac4658f33e51c0327009b65b98cc2a8a908dc821c307eb9aa11b89" Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.051098 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-65977b5879-qctf6" Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.085646 4841 scope.go:117] "RemoveContainer" containerID="507b9c74c5025e94796600d70c3425d021076e350f65b3c2da7a6f02528353c9" Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.108239 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-65977b5879-qctf6"] Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.118660 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-65977b5879-qctf6"] Jan 30 05:30:04 crc kubenswrapper[4841]: I0130 05:30:04.448888 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" path="/var/lib/kubelet/pods/8ad9e30b-abf9-45fd-9088-103c94e4ed70/volumes" Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.568798 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.569372 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.570034 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.570101 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server" Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.571081 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.573206 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.575021 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:04 crc kubenswrapper[4841]: E0130 05:30:04.575070 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd" Jan 30 05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.569291 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.570805 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.571163 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.571214 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server" Jan 30 05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.573367 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.575386 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 
05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.577578 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 30 05:30:09 crc kubenswrapper[4841]: E0130 05:30:09.577642 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lbv2q" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.194799 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lbv2q_582a9577-0530-4793-8723-01681bdcfda4/ovs-vswitchd/0.log" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.197556 4841 generic.go:334] "Generic (PLEG): container finished" podID="582a9577-0530-4793-8723-01681bdcfda4" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857" exitCode=137 Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.197615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lbv2q" event={"ID":"582a9577-0530-4793-8723-01681bdcfda4","Type":"ContainerDied","Data":"6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857"} Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.425966 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lbv2q_582a9577-0530-4793-8723-01681bdcfda4/ovs-vswitchd/0.log" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.427264 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593295 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/582a9577-0530-4793-8723-01681bdcfda4-scripts\") pod \"582a9577-0530-4793-8723-01681bdcfda4\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593389 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-etc-ovs\") pod \"582a9577-0530-4793-8723-01681bdcfda4\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593520 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-run\") pod \"582a9577-0530-4793-8723-01681bdcfda4\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "582a9577-0530-4793-8723-01681bdcfda4" (UID: "582a9577-0530-4793-8723-01681bdcfda4"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593548 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqz9b\" (UniqueName: \"kubernetes.io/projected/582a9577-0530-4793-8723-01681bdcfda4-kube-api-access-rqz9b\") pod \"582a9577-0530-4793-8723-01681bdcfda4\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593569 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-log\") pod \"582a9577-0530-4793-8723-01681bdcfda4\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593590 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-lib\") pod \"582a9577-0530-4793-8723-01681bdcfda4\" (UID: \"582a9577-0530-4793-8723-01681bdcfda4\") " Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593592 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-run" (OuterVolumeSpecName: "var-run") pod "582a9577-0530-4793-8723-01681bdcfda4" (UID: "582a9577-0530-4793-8723-01681bdcfda4"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593923 4841 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.593933 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.594179 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-log" (OuterVolumeSpecName: "var-log") pod "582a9577-0530-4793-8723-01681bdcfda4" (UID: "582a9577-0530-4793-8723-01681bdcfda4"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.594223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-lib" (OuterVolumeSpecName: "var-lib") pod "582a9577-0530-4793-8723-01681bdcfda4" (UID: "582a9577-0530-4793-8723-01681bdcfda4"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.596290 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582a9577-0530-4793-8723-01681bdcfda4-scripts" (OuterVolumeSpecName: "scripts") pod "582a9577-0530-4793-8723-01681bdcfda4" (UID: "582a9577-0530-4793-8723-01681bdcfda4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.601248 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582a9577-0530-4793-8723-01681bdcfda4-kube-api-access-rqz9b" (OuterVolumeSpecName: "kube-api-access-rqz9b") pod "582a9577-0530-4793-8723-01681bdcfda4" (UID: "582a9577-0530-4793-8723-01681bdcfda4"). InnerVolumeSpecName "kube-api-access-rqz9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.695022 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqz9b\" (UniqueName: \"kubernetes.io/projected/582a9577-0530-4793-8723-01681bdcfda4-kube-api-access-rqz9b\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.695056 4841 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-log\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.695065 4841 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/582a9577-0530-4793-8723-01681bdcfda4-var-lib\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.695074 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/582a9577-0530-4793-8723-01681bdcfda4-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:14 crc kubenswrapper[4841]: I0130 05:30:14.953808 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.099667 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-lock\") pod \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.099738 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-cache\") pod \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.099784 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b70c69eb-7b62-446a-8748-9a80d6fbe28b-combined-ca-bundle\") pod \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.099838 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2jw7\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-kube-api-access-g2jw7\") pod \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.099922 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.100025 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") pod \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\" (UID: \"b70c69eb-7b62-446a-8748-9a80d6fbe28b\") " Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.100457 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-lock" (OuterVolumeSpecName: "lock") pod "b70c69eb-7b62-446a-8748-9a80d6fbe28b" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.100628 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-cache" (OuterVolumeSpecName: "cache") pod "b70c69eb-7b62-446a-8748-9a80d6fbe28b" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.101568 4841 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-lock\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.101609 4841 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b70c69eb-7b62-446a-8748-9a80d6fbe28b-cache\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.106193 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "swift") pod "b70c69eb-7b62-446a-8748-9a80d6fbe28b" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.106260 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-kube-api-access-g2jw7" (OuterVolumeSpecName: "kube-api-access-g2jw7") pod "b70c69eb-7b62-446a-8748-9a80d6fbe28b" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b"). InnerVolumeSpecName "kube-api-access-g2jw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.113670 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b70c69eb-7b62-446a-8748-9a80d6fbe28b" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.202964 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.205492 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2jw7\" (UniqueName: \"kubernetes.io/projected/b70c69eb-7b62-446a-8748-9a80d6fbe28b-kube-api-access-g2jw7\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.205556 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.214906 4841 generic.go:334] "Generic (PLEG): container finished" podID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerID="1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c" exitCode=137 
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.214966 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c"} Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.215026 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b70c69eb-7b62-446a-8748-9a80d6fbe28b","Type":"ContainerDied","Data":"b6be38aa978c80c7d631871602aee59c9e2747140cd9edc429408448302bfdaa"} Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.215063 4841 scope.go:117] "RemoveContainer" containerID="1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.215318 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.222388 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lbv2q_582a9577-0530-4793-8723-01681bdcfda4/ovs-vswitchd/0.log" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.223392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lbv2q" event={"ID":"582a9577-0530-4793-8723-01681bdcfda4","Type":"ContainerDied","Data":"71c2d9813a015d8a631b3ab4999791ef8e0fb3377202fc4db173f9657230e629"} Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.223546 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lbv2q" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.229101 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.246294 4841 scope.go:117] "RemoveContainer" containerID="ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.261607 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-lbv2q"] Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.270303 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-lbv2q"] Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.282010 4841 scope.go:117] "RemoveContainer" containerID="d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.304161 4841 scope.go:117] "RemoveContainer" containerID="0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.307194 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.331634 4841 scope.go:117] "RemoveContainer" containerID="9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.359932 4841 scope.go:117] "RemoveContainer" containerID="cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.380730 4841 scope.go:117] "RemoveContainer" containerID="a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.402242 
4841 scope.go:117] "RemoveContainer" containerID="4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.425216 4841 scope.go:117] "RemoveContainer" containerID="4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.426640 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b70c69eb-7b62-446a-8748-9a80d6fbe28b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b70c69eb-7b62-446a-8748-9a80d6fbe28b" (UID: "b70c69eb-7b62-446a-8748-9a80d6fbe28b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.444576 4841 scope.go:117] "RemoveContainer" containerID="3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.466463 4841 scope.go:117] "RemoveContainer" containerID="a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.491346 4841 scope.go:117] "RemoveContainer" containerID="092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.509228 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b70c69eb-7b62-446a-8748-9a80d6fbe28b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.514166 4841 scope.go:117] "RemoveContainer" containerID="0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.541554 4841 scope.go:117] "RemoveContainer" containerID="dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.566378 4841 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.577128 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.598591 4841 scope.go:117] "RemoveContainer" containerID="31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.620185 4841 scope.go:117] "RemoveContainer" containerID="1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.620659 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c\": container with ID starting with 1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c not found: ID does not exist" containerID="1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.620798 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c"} err="failed to get container status \"1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c\": rpc error: code = NotFound desc = could not find container \"1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c\": container with ID starting with 1d0d3ea5cd4f17462a0e58531670232fc05e76c5f395917b0323050cb0af836c not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.621013 4841 scope.go:117] "RemoveContainer" containerID="ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.621445 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864\": container with ID starting with ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864 not found: ID does not exist" containerID="ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.621486 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864"} err="failed to get container status \"ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864\": rpc error: code = NotFound desc = could not find container \"ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864\": container with ID starting with ea247d63ceae89c06639294d33105caa7494f7025ba9cc0ff2baacdd113aa864 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.621515 4841 scope.go:117] "RemoveContainer" containerID="d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.622195 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403\": container with ID starting with d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403 not found: ID does not exist" containerID="d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.622323 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403"} err="failed to get container status \"d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403\": rpc error: code = NotFound desc = could not find container \"d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403\": container 
with ID starting with d7187920aa13a5f4e8c9b3d06bafca1007a2f86b063bb5a42b3b09a5fbb84403 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.622450 4841 scope.go:117] "RemoveContainer" containerID="0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.622841 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e\": container with ID starting with 0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e not found: ID does not exist" containerID="0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.622875 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e"} err="failed to get container status \"0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e\": rpc error: code = NotFound desc = could not find container \"0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e\": container with ID starting with 0a3d11b426d7eb63a85438318a5b1e6e9bf58396c55e7a0d02b0e45d8f02237e not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.622892 4841 scope.go:117] "RemoveContainer" containerID="9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.623483 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab\": container with ID starting with 9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab not found: ID does not exist" containerID="9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab" 
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.623611 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab"} err="failed to get container status \"9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab\": rpc error: code = NotFound desc = could not find container \"9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab\": container with ID starting with 9d865f6e7a91c94c5b1279862159ae8f1feabc9e7acaf18b9d844a8b0503feab not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.623696 4841 scope.go:117] "RemoveContainer" containerID="cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.624203 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63\": container with ID starting with cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63 not found: ID does not exist" containerID="cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.624273 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63"} err="failed to get container status \"cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63\": rpc error: code = NotFound desc = could not find container \"cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63\": container with ID starting with cc96689a0aaf41b7825f3cb58d836803df16f44fe7f7ea35f679f361593b5d63 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.624328 4841 scope.go:117] "RemoveContainer" 
containerID="a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.624796 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07\": container with ID starting with a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07 not found: ID does not exist" containerID="a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.624825 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07"} err="failed to get container status \"a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07\": rpc error: code = NotFound desc = could not find container \"a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07\": container with ID starting with a5754928e32c1ebcd28e53bbec22ec5bc8acf82d9fbb8cecab814066b4867e07 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.624848 4841 scope.go:117] "RemoveContainer" containerID="4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.625505 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284\": container with ID starting with 4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284 not found: ID does not exist" containerID="4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.625570 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284"} err="failed to get container status \"4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284\": rpc error: code = NotFound desc = could not find container \"4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284\": container with ID starting with 4271b3971428a40d5d7ac4163e3b7ff54fbd9b7fb00021894518810060938284 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.625610 4841 scope.go:117] "RemoveContainer" containerID="4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.626585 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131\": container with ID starting with 4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131 not found: ID does not exist" containerID="4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.626706 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131"} err="failed to get container status \"4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131\": rpc error: code = NotFound desc = could not find container \"4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131\": container with ID starting with 4b603aa8b908f309f1bb944ace86fd98a9b59bd96b719e28188920d6e77fe131 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.626791 4841 scope.go:117] "RemoveContainer" containerID="3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.627188 4841 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4\": container with ID starting with 3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4 not found: ID does not exist" containerID="3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.627220 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4"} err="failed to get container status \"3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4\": rpc error: code = NotFound desc = could not find container \"3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4\": container with ID starting with 3a62b9350c6733bb4530efc264b9c6d464ada907d34ba9bc60c798b347c98ce4 not found: ID does not exist" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.627240 4841 scope.go:117] "RemoveContainer" containerID="a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252" Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.627790 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252\": container with ID starting with a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252 not found: ID does not exist" containerID="a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252" Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.627901 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252"} err="failed to get container status \"a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252\": rpc error: code = NotFound desc = could not find container 
\"a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252\": container with ID starting with a3cc8bc115880061618975c4df841a965c783571294930e67f66659f5fba2252 not found: ID does not exist"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.627999 4841 scope.go:117] "RemoveContainer" containerID="092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c"
Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.628427 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c\": container with ID starting with 092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c not found: ID does not exist" containerID="092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.628564 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c"} err="failed to get container status \"092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c\": rpc error: code = NotFound desc = could not find container \"092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c\": container with ID starting with 092f938ad84d3c6473988d000c9c5ec32377301ee3a6725bdba811fb95161d6c not found: ID does not exist"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.628688 4841 scope.go:117] "RemoveContainer" containerID="0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405"
Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.629056 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405\": container with ID starting with 0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405 not found: ID does not exist" containerID="0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.629091 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405"} err="failed to get container status \"0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405\": rpc error: code = NotFound desc = could not find container \"0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405\": container with ID starting with 0888fc6d1dfe5c7443f78b23e447d53429bc8db154e4c4a260b031b7bb82e405 not found: ID does not exist"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.629109 4841 scope.go:117] "RemoveContainer" containerID="dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14"
Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.629687 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14\": container with ID starting with dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14 not found: ID does not exist" containerID="dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.629753 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14"} err="failed to get container status \"dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14\": rpc error: code = NotFound desc = could not find container \"dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14\": container with ID starting with dfd07a9b38321970ef1b982959a79cf5910b54a3aee1ab7adc92cbee46989e14 not found: ID does not exist"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.629802 4841 scope.go:117] "RemoveContainer" containerID="31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb"
Jan 30 05:30:15 crc kubenswrapper[4841]: E0130 05:30:15.630275 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb\": container with ID starting with 31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb not found: ID does not exist" containerID="31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.630338 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb"} err="failed to get container status \"31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb\": rpc error: code = NotFound desc = could not find container \"31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb\": container with ID starting with 31ce0586d7ea8a51e7b677267c8b37cafa30ce59cba52ce3faeb7d6944fa68fb not found: ID does not exist"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.630377 4841 scope.go:117] "RemoveContainer" containerID="6d078f19b01c3c192f2211ce36a9f80e12b7e1cc3bb74c9b38ead0ef1d7f3857"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.660156 4841 scope.go:117] "RemoveContainer" containerID="0574b86f1e2ad387b2e6285017e67a16694cf4bcd9a8fbe33792017f5fe5cb15"
Jan 30 05:30:15 crc kubenswrapper[4841]: I0130 05:30:15.686644 4841 scope.go:117] "RemoveContainer" containerID="f264178a5a1264f97f0566a592589097cee3ffb7247b2cd741139e99d585ed2f"
Jan 30 05:30:16 crc kubenswrapper[4841]: I0130 05:30:16.447751 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582a9577-0530-4793-8723-01681bdcfda4" path="/var/lib/kubelet/pods/582a9577-0530-4793-8723-01681bdcfda4/volumes"
Jan 30 05:30:16 crc kubenswrapper[4841]: I0130 05:30:16.449258 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" path="/var/lib/kubelet/pods/b70c69eb-7b62-446a-8748-9a80d6fbe28b/volumes"
Jan 30 05:30:16 crc kubenswrapper[4841]: I0130 05:30:16.485359 4841 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podf198eff9-f493-43d9-9b64-06196b205142"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podf198eff9-f493-43d9-9b64-06196b205142] : Timed out while waiting for systemd to remove kubepods-besteffort-podf198eff9_f493_43d9_9b64_06196b205142.slice"
Jan 30 05:30:18 crc kubenswrapper[4841]: I0130 05:30:18.769315 4841 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod1a2724da-6b9b-4947-a4e3-894938742304"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod1a2724da-6b9b-4947-a4e3-894938742304] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1a2724da_6b9b_4947_a4e3_894938742304.slice"
Jan 30 05:30:18 crc kubenswrapper[4841]: E0130 05:30:18.769585 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod1a2724da-6b9b-4947-a4e3-894938742304] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod1a2724da-6b9b-4947-a4e3-894938742304] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1a2724da_6b9b_4947_a4e3_894938742304.slice" pod="openstack/cinder-api-0" podUID="1a2724da-6b9b-4947-a4e3-894938742304"
Jan 30 05:30:19 crc kubenswrapper[4841]: I0130 05:30:19.047833 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:30:19 crc kubenswrapper[4841]: I0130 05:30:19.047933 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="73fdf532-7bb7-43db-acbc-b949166ccd6b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 30 05:30:19 crc kubenswrapper[4841]: I0130 05:30:19.273347 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 05:30:19 crc kubenswrapper[4841]: I0130 05:30:19.299913 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:30:19 crc kubenswrapper[4841]: I0130 05:30:19.306985 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 05:30:20 crc kubenswrapper[4841]: I0130 05:30:20.447150 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a2724da-6b9b-4947-a4e3-894938742304" path="/var/lib/kubelet/pods/1a2724da-6b9b-4947-a4e3-894938742304/volumes"
Jan 30 05:31:40 crc kubenswrapper[4841]: I0130 05:31:40.463618 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:31:40 crc kubenswrapper[4841]: I0130 05:31:40.464496 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.041249 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2v8s8"]
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042081 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-reaper"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042103 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-reaper"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042126 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042138 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042159 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3827ca89-b447-4c79-a946-bb1170c1e039" containerName="collect-profiles"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042172 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3827ca89-b447-4c79-a946-bb1170c1e039" containerName="collect-profiles"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042192 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042204 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042224 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-updater"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042235 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-updater"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042256 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server-init"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042268 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server-init"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042292 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042304 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042326 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042338 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042359 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042371 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042391 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042428 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042451 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042463 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042481 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042493 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042506 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-expirer"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042518 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-expirer"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042538 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="swift-recon-cron"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042551 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="swift-recon-cron"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042569 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-httpd"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042582 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-httpd"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042596 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="rsync"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042607 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="rsync"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042625 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042637 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042654 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042665 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042678 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-api"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042690 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-api"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042708 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042719 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: E0130 05:31:44.042738 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-updater"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042749 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-updater"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042974 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-api"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.042998 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovs-vswitchd"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043015 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043038 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-reaper"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043054 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad9e30b-abf9-45fd-9088-103c94e4ed70" containerName="neutron-httpd"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043069 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043084 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="rsync"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043105 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043128 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043144 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3827ca89-b447-4c79-a946-bb1170c1e039" containerName="collect-profiles"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043161 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043176 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-updater"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043193 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-expirer"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043216 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="582a9577-0530-4793-8723-01681bdcfda4" containerName="ovsdb-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043232 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043248 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="object-auditor"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043265 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-server"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043288 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="container-updater"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043305 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="account-replicator"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.043318 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70c69eb-7b62-446a-8748-9a80d6fbe28b" containerName="swift-recon-cron"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.045063 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.073072 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v8s8"]
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.174770 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-catalog-content\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.174918 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhbz\" (UniqueName: \"kubernetes.io/projected/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-kube-api-access-8bhbz\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.175063 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-utilities\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.276865 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-catalog-content\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.277017 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhbz\" (UniqueName: \"kubernetes.io/projected/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-kube-api-access-8bhbz\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.277160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-utilities\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.277774 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-catalog-content\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.277961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-utilities\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.306586 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhbz\" (UniqueName: \"kubernetes.io/projected/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-kube-api-access-8bhbz\") pod \"redhat-marketplace-2v8s8\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") " pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.375334 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:44 crc kubenswrapper[4841]: I0130 05:31:44.705481 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v8s8"]
Jan 30 05:31:45 crc kubenswrapper[4841]: I0130 05:31:45.191511 4841 generic.go:334] "Generic (PLEG): container finished" podID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerID="4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7" exitCode=0
Jan 30 05:31:45 crc kubenswrapper[4841]: I0130 05:31:45.191575 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v8s8" event={"ID":"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1","Type":"ContainerDied","Data":"4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7"}
Jan 30 05:31:45 crc kubenswrapper[4841]: I0130 05:31:45.191613 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v8s8" event={"ID":"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1","Type":"ContainerStarted","Data":"33ea52fe2b45ea1959347bdc4ea96fb9799a5ac2b24d227dfeb365305459d1f5"}
Jan 30 05:31:46 crc kubenswrapper[4841]: I0130 05:31:46.782324 4841 scope.go:117] "RemoveContainer" containerID="2733c25272bc77b1dfc05208dea14e07e8ff595e8adce5963f17598a5feb122b"
Jan 30 05:31:46 crc kubenswrapper[4841]: I0130 05:31:46.824309 4841 scope.go:117] "RemoveContainer" containerID="c9437d4bfc1aa568db9f236839b9880300b298866644c5b7a56808c9053266e9"
Jan 30 05:31:46 crc kubenswrapper[4841]: I0130 05:31:46.867841 4841 scope.go:117] "RemoveContainer" containerID="9c08f840a59acac8f70270c43eea5ff5025ee709df5670b4390941105ac52fdb"
Jan 30 05:31:46 crc kubenswrapper[4841]: I0130 05:31:46.904893 4841 scope.go:117] "RemoveContainer" containerID="116957a31ecc7810eee65d27433abb88c8fec810845e23d0583e85d00bf7ce89"
Jan 30 05:31:46 crc kubenswrapper[4841]: I0130 05:31:46.938071 4841 scope.go:117] "RemoveContainer" containerID="2fb5720129240f51365edf86e5b5e4984e61c72f72916a58ebebcb8a9d93bad3"
Jan 30 05:31:46 crc kubenswrapper[4841]: I0130 05:31:46.972794 4841 scope.go:117] "RemoveContainer" containerID="0182770a249e71ae68983a8beb058b839e3edb9972bf7c96097b9a39e6587d5f"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.004524 4841 scope.go:117] "RemoveContainer" containerID="beddd0e5331c71a232c6ceda578e9012b0f3d2f15d7b1915e9466873b9283ff4"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.032486 4841 scope.go:117] "RemoveContainer" containerID="e3a84a22abdca1f0ccd6cb67b6a898ad9c44aaeffcafccdee75399832d892e6a"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.062765 4841 scope.go:117] "RemoveContainer" containerID="34e0d86f2d28c1bd9d913121757607f09ba8a336c043340479e6bcb08471c2ac"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.143368 4841 scope.go:117] "RemoveContainer" containerID="91f9480848574a237405ae1da5561a7e4158f6f071fd8b1259fa2288ebb97e00"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.171622 4841 scope.go:117] "RemoveContainer" containerID="ccf928f9462b3ee5fb5e26c65672c931a7fab29d56b8e9e63d2a1c130071a792"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.199314 4841 scope.go:117] "RemoveContainer" containerID="d429e82bce41d5cf25298d70fbda502e5d597b897e823ca0e7f4e7bee8495c64"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.226942 4841 scope.go:117] "RemoveContainer" containerID="3848f383886419071be51c8b532717e09cd0c6ae367b14c933b86749a45d48de"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.229000 4841 generic.go:334] "Generic (PLEG): container finished" podID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerID="bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000" exitCode=0
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.229043 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v8s8" event={"ID":"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1","Type":"ContainerDied","Data":"bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000"}
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.251316 4841 scope.go:117] "RemoveContainer" containerID="85111eda2e1bde25e3a60cad27ea30e9be9e11c5f182fa65d1bb53519be46498"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.291915 4841 scope.go:117] "RemoveContainer" containerID="99b725593da174a758fc675f27f806a5cc026e7e2b8a1e1f72cd1b69bbef305c"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.332854 4841 scope.go:117] "RemoveContainer" containerID="a56f6d727a91da435870dbaf6b07fb62f03128af7ecaf22716aced269514b47c"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.364419 4841 scope.go:117] "RemoveContainer" containerID="7c2452ee2b7d6245999af2b7e81b2287c244e2079fa771faf3530fcfb5a47e79"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.392791 4841 scope.go:117] "RemoveContainer" containerID="0a70a0f6a798c8a514a3ce1eb1cd476bac1ed23f3497cd8cc84628ea9151c1f3"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.426763 4841 scope.go:117] "RemoveContainer" containerID="15adcc49aa8a94852a40b3a0596b418fd77363d9823b75d779030e55e32d5311"
Jan 30 05:31:47 crc kubenswrapper[4841]: I0130 05:31:47.502285 4841 scope.go:117] "RemoveContainer" containerID="bf835ac6f606ee2b27b85b5eed4b650030810616e051c84d480b09ee6b694955"
Jan 30 05:31:48 crc kubenswrapper[4841]: I0130 05:31:48.246726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v8s8" event={"ID":"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1","Type":"ContainerStarted","Data":"5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc"}
Jan 30 05:31:48 crc kubenswrapper[4841]: I0130 05:31:48.283343 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2v8s8" podStartSLOduration=1.739109867 podStartE2EDuration="4.283318086s" podCreationTimestamp="2026-01-30 05:31:44 +0000 UTC" firstStartedPulling="2026-01-30 05:31:45.194115724 +0000 UTC m=+1442.187588392" lastFinishedPulling="2026-01-30 05:31:47.738323963 +0000 UTC m=+1444.731796611" observedRunningTime="2026-01-30 05:31:48.274849489 +0000 UTC m=+1445.268322127" watchObservedRunningTime="2026-01-30 05:31:48.283318086 +0000 UTC m=+1445.276790764"
Jan 30 05:31:54 crc kubenswrapper[4841]: I0130 05:31:54.375779 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:54 crc kubenswrapper[4841]: I0130 05:31:54.376444 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:54 crc kubenswrapper[4841]: I0130 05:31:54.452241 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:55 crc kubenswrapper[4841]: I0130 05:31:55.381605 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:55 crc kubenswrapper[4841]: I0130 05:31:55.449513 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v8s8"]
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.328454 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2v8s8" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="registry-server" containerID="cri-o://5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc" gracePeriod=2
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.786239 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v8s8"
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.953489 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-catalog-content\") pod \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") "
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.953713 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-utilities\") pod \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") "
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.953796 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bhbz\" (UniqueName: \"kubernetes.io/projected/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-kube-api-access-8bhbz\") pod \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\" (UID: \"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1\") "
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.955226 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-utilities" (OuterVolumeSpecName: "utilities") pod "8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" (UID: "8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.965774 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-kube-api-access-8bhbz" (OuterVolumeSpecName: "kube-api-access-8bhbz") pod "8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" (UID: "8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1"). InnerVolumeSpecName "kube-api-access-8bhbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:31:57 crc kubenswrapper[4841]: I0130 05:31:57.980991 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" (UID: "8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.055820 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.055872 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bhbz\" (UniqueName: \"kubernetes.io/projected/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-kube-api-access-8bhbz\") on node \"crc\" DevicePath \"\""
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.055899 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.342503 4841 generic.go:334] "Generic (PLEG): container finished" podID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerID="5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc" exitCode=0
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.342580 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v8s8" event={"ID":"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1","Type":"ContainerDied","Data":"5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc"}
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.342664 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v8s8" event={"ID":"8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1","Type":"ContainerDied","Data":"33ea52fe2b45ea1959347bdc4ea96fb9799a5ac2b24d227dfeb365305459d1f5"}
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.342696 4841 scope.go:117] "RemoveContainer" containerID="5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc"
Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.344352 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v8s8" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.377715 4841 scope.go:117] "RemoveContainer" containerID="bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.397854 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v8s8"] Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.405054 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v8s8"] Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.410436 4841 scope.go:117] "RemoveContainer" containerID="4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.444245 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" path="/var/lib/kubelet/pods/8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1/volumes" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.458488 4841 scope.go:117] "RemoveContainer" containerID="5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc" Jan 30 05:31:58 crc kubenswrapper[4841]: E0130 05:31:58.459203 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc\": container with ID starting with 5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc not found: ID does not exist" containerID="5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.459268 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc"} err="failed to get container status 
\"5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc\": rpc error: code = NotFound desc = could not find container \"5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc\": container with ID starting with 5c6c6b9ffedb131a311acc2f865374e7226119f605b7228ae780d3f7715a6ffc not found: ID does not exist" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.459313 4841 scope.go:117] "RemoveContainer" containerID="bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000" Jan 30 05:31:58 crc kubenswrapper[4841]: E0130 05:31:58.459999 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000\": container with ID starting with bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000 not found: ID does not exist" containerID="bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.460047 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000"} err="failed to get container status \"bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000\": rpc error: code = NotFound desc = could not find container \"bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000\": container with ID starting with bd21c4d6a9ba6b99352e7d732ca438a36699c7dc821b7a5da92d6b89d8ffd000 not found: ID does not exist" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.460077 4841 scope.go:117] "RemoveContainer" containerID="4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7" Jan 30 05:31:58 crc kubenswrapper[4841]: E0130 05:31:58.460466 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7\": container with ID starting with 4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7 not found: ID does not exist" containerID="4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7" Jan 30 05:31:58 crc kubenswrapper[4841]: I0130 05:31:58.460505 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7"} err="failed to get container status \"4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7\": rpc error: code = NotFound desc = could not find container \"4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7\": container with ID starting with 4b391406b3a2a70d730a0a29e69c70de19f4ca9b16fb6076b02268b433f42fd7 not found: ID does not exist" Jan 30 05:32:10 crc kubenswrapper[4841]: I0130 05:32:10.464266 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:32:10 crc kubenswrapper[4841]: I0130 05:32:10.465027 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.386726 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f8lbn"] Jan 30 05:32:18 crc kubenswrapper[4841]: E0130 05:32:18.389186 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="registry-server" Jan 30 
05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.389339 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="registry-server" Jan 30 05:32:18 crc kubenswrapper[4841]: E0130 05:32:18.389502 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="extract-utilities" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.389623 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="extract-utilities" Jan 30 05:32:18 crc kubenswrapper[4841]: E0130 05:32:18.389909 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="extract-content" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.390070 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="extract-content" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.390606 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bef2445-8a9f-48ae-bc7c-d6a05d7f73d1" containerName="registry-server" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.392743 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.407588 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f8lbn"] Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.498296 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlg7c\" (UniqueName: \"kubernetes.io/projected/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-kube-api-access-zlg7c\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.498583 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-utilities\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.498720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-catalog-content\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.600308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-catalog-content\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.600530 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zlg7c\" (UniqueName: \"kubernetes.io/projected/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-kube-api-access-zlg7c\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.600552 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-utilities\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.601092 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-catalog-content\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.601571 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-utilities\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.619594 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlg7c\" (UniqueName: \"kubernetes.io/projected/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-kube-api-access-zlg7c\") pod \"certified-operators-f8lbn\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:18 crc kubenswrapper[4841]: I0130 05:32:18.723971 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:19 crc kubenswrapper[4841]: I0130 05:32:19.173205 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f8lbn"] Jan 30 05:32:19 crc kubenswrapper[4841]: I0130 05:32:19.552353 4841 generic.go:334] "Generic (PLEG): container finished" podID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerID="acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05" exitCode=0 Jan 30 05:32:19 crc kubenswrapper[4841]: I0130 05:32:19.552577 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8lbn" event={"ID":"f72a5b9d-fa64-4506-8312-ed76d0c4eb21","Type":"ContainerDied","Data":"acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05"} Jan 30 05:32:19 crc kubenswrapper[4841]: I0130 05:32:19.554235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8lbn" event={"ID":"f72a5b9d-fa64-4506-8312-ed76d0c4eb21","Type":"ContainerStarted","Data":"95dc4c8e4364f292c4ab7495496cec902030d0e827529732771e9232cfbed995"} Jan 30 05:32:20 crc kubenswrapper[4841]: I0130 05:32:20.567866 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8lbn" event={"ID":"f72a5b9d-fa64-4506-8312-ed76d0c4eb21","Type":"ContainerStarted","Data":"d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a"} Jan 30 05:32:21 crc kubenswrapper[4841]: I0130 05:32:21.584061 4841 generic.go:334] "Generic (PLEG): container finished" podID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerID="d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a" exitCode=0 Jan 30 05:32:21 crc kubenswrapper[4841]: I0130 05:32:21.584122 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8lbn" 
event={"ID":"f72a5b9d-fa64-4506-8312-ed76d0c4eb21","Type":"ContainerDied","Data":"d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a"} Jan 30 05:32:22 crc kubenswrapper[4841]: I0130 05:32:22.600721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8lbn" event={"ID":"f72a5b9d-fa64-4506-8312-ed76d0c4eb21","Type":"ContainerStarted","Data":"871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692"} Jan 30 05:32:22 crc kubenswrapper[4841]: I0130 05:32:22.636347 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f8lbn" podStartSLOduration=2.182094182 podStartE2EDuration="4.636321667s" podCreationTimestamp="2026-01-30 05:32:18 +0000 UTC" firstStartedPulling="2026-01-30 05:32:19.554811842 +0000 UTC m=+1476.548284520" lastFinishedPulling="2026-01-30 05:32:22.009039317 +0000 UTC m=+1479.002512005" observedRunningTime="2026-01-30 05:32:22.631514168 +0000 UTC m=+1479.624986846" watchObservedRunningTime="2026-01-30 05:32:22.636321667 +0000 UTC m=+1479.629794345" Jan 30 05:32:28 crc kubenswrapper[4841]: I0130 05:32:28.724312 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:28 crc kubenswrapper[4841]: I0130 05:32:28.724975 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:28 crc kubenswrapper[4841]: I0130 05:32:28.800506 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:29 crc kubenswrapper[4841]: I0130 05:32:29.747795 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:29 crc kubenswrapper[4841]: I0130 05:32:29.821050 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-f8lbn"] Jan 30 05:32:31 crc kubenswrapper[4841]: I0130 05:32:31.688353 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f8lbn" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="registry-server" containerID="cri-o://871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692" gracePeriod=2 Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.224388 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.320281 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-catalog-content\") pod \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.321054 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlg7c\" (UniqueName: \"kubernetes.io/projected/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-kube-api-access-zlg7c\") pod \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.321112 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-utilities\") pod \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\" (UID: \"f72a5b9d-fa64-4506-8312-ed76d0c4eb21\") " Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.323356 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-utilities" (OuterVolumeSpecName: "utilities") pod "f72a5b9d-fa64-4506-8312-ed76d0c4eb21" (UID: 
"f72a5b9d-fa64-4506-8312-ed76d0c4eb21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.326954 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-kube-api-access-zlg7c" (OuterVolumeSpecName: "kube-api-access-zlg7c") pod "f72a5b9d-fa64-4506-8312-ed76d0c4eb21" (UID: "f72a5b9d-fa64-4506-8312-ed76d0c4eb21"). InnerVolumeSpecName "kube-api-access-zlg7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.409092 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f72a5b9d-fa64-4506-8312-ed76d0c4eb21" (UID: "f72a5b9d-fa64-4506-8312-ed76d0c4eb21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.425541 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlg7c\" (UniqueName: \"kubernetes.io/projected/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-kube-api-access-zlg7c\") on node \"crc\" DevicePath \"\"" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.425621 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.425646 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f72a5b9d-fa64-4506-8312-ed76d0c4eb21-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.699718 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerID="871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692" exitCode=0 Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.699762 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8lbn" event={"ID":"f72a5b9d-fa64-4506-8312-ed76d0c4eb21","Type":"ContainerDied","Data":"871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692"} Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.699782 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f8lbn" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.699800 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8lbn" event={"ID":"f72a5b9d-fa64-4506-8312-ed76d0c4eb21","Type":"ContainerDied","Data":"95dc4c8e4364f292c4ab7495496cec902030d0e827529732771e9232cfbed995"} Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.699821 4841 scope.go:117] "RemoveContainer" containerID="871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.724266 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f8lbn"] Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.731575 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f8lbn"] Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.736568 4841 scope.go:117] "RemoveContainer" containerID="d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.765953 4841 scope.go:117] "RemoveContainer" containerID="acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.785024 4841 scope.go:117] "RemoveContainer" 
containerID="871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692" Jan 30 05:32:32 crc kubenswrapper[4841]: E0130 05:32:32.785376 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692\": container with ID starting with 871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692 not found: ID does not exist" containerID="871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.785421 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692"} err="failed to get container status \"871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692\": rpc error: code = NotFound desc = could not find container \"871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692\": container with ID starting with 871892f7a51bd1142ad417abac955cd71ec074e25092ebcfdc6cc4d6442b6692 not found: ID does not exist" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.785443 4841 scope.go:117] "RemoveContainer" containerID="d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a" Jan 30 05:32:32 crc kubenswrapper[4841]: E0130 05:32:32.785730 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a\": container with ID starting with d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a not found: ID does not exist" containerID="d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.785791 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a"} err="failed to get container status \"d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a\": rpc error: code = NotFound desc = could not find container \"d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a\": container with ID starting with d692c0e026703113f67f6caf1a80c5af7aa8e331e83667d8efb9c3a61ee1db8a not found: ID does not exist" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.785841 4841 scope.go:117] "RemoveContainer" containerID="acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05" Jan 30 05:32:32 crc kubenswrapper[4841]: E0130 05:32:32.786199 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05\": container with ID starting with acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05 not found: ID does not exist" containerID="acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05" Jan 30 05:32:32 crc kubenswrapper[4841]: I0130 05:32:32.786252 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05"} err="failed to get container status \"acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05\": rpc error: code = NotFound desc = could not find container \"acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05\": container with ID starting with acec743d406b5da5bfbe97bf1fa1d0618609d35fc7229468be555021af4dfc05 not found: ID does not exist" Jan 30 05:32:34 crc kubenswrapper[4841]: I0130 05:32:34.448354 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" path="/var/lib/kubelet/pods/f72a5b9d-fa64-4506-8312-ed76d0c4eb21/volumes" Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 
05:32:40.463871 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.464497 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.464562 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.465429 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.465514 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" gracePeriod=600 Jan 30 05:32:40 crc kubenswrapper[4841]: E0130 05:32:40.605877 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.790590 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" exitCode=0 Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.790688 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"} Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.790791 4841 scope.go:117] "RemoveContainer" containerID="8bf8b4cf30e4dca128fb9c700ac455cfd4ad66705f2809226e958756cecd6fcb" Jan 30 05:32:40 crc kubenswrapper[4841]: I0130 05:32:40.791731 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:32:40 crc kubenswrapper[4841]: E0130 05:32:40.792223 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:32:47 crc kubenswrapper[4841]: I0130 05:32:47.974447 4841 scope.go:117] "RemoveContainer" containerID="5c4314bd5d4e8c9a32d1faa3556ab7cd3dae6013e3228dbb71483e6295221f1f" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.007220 4841 scope.go:117] "RemoveContainer" 
containerID="142f87d2cb0e9372bbcdecb138aed934cb74505996af8c75485dcbe012426e16" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.068672 4841 scope.go:117] "RemoveContainer" containerID="5ea394b127fd970af53caa39f1534450199ef44d3f1786176954b4e967668dcb" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.115883 4841 scope.go:117] "RemoveContainer" containerID="b3f1445caa2309dc539440752e7e25f18a4dc9dad577904f8bc72b78c43bd48f" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.167419 4841 scope.go:117] "RemoveContainer" containerID="4917c8f1f511e2ab1d1cbd400696ac027e9a598fabd5f2c438a5fb3e801553af" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.227784 4841 scope.go:117] "RemoveContainer" containerID="3d6a7ca7f9fa1770de4767c7a65014825dae033a2797b29ddd22537c9692650e" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.267481 4841 scope.go:117] "RemoveContainer" containerID="0a7c1a689c4e10c27ddb7c3c12fcc8d9ee61127c2c5b8d9fdc13d03072e0a7c1" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.295803 4841 scope.go:117] "RemoveContainer" containerID="0fcfa87f68875bfddaa93b739bc572583fbc92347ff5282b6f4b67fd22982107" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.338188 4841 scope.go:117] "RemoveContainer" containerID="2b1a0e1c3a5b1bc5a2cea93c3da88301a3042badaf0a1dbc97c75b42d3595c7d" Jan 30 05:32:48 crc kubenswrapper[4841]: I0130 05:32:48.354325 4841 scope.go:117] "RemoveContainer" containerID="6b1c61cf113a9d02dc8e84edb3cccdb64ff41343455a288f83fa9a65400016f9" Jan 30 05:32:55 crc kubenswrapper[4841]: I0130 05:32:55.433458 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:32:55 crc kubenswrapper[4841]: E0130 05:32:55.434389 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:33:09 crc kubenswrapper[4841]: I0130 05:33:09.431789 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:33:09 crc kubenswrapper[4841]: E0130 05:33:09.432782 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:33:23 crc kubenswrapper[4841]: I0130 05:33:23.431956 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:33:23 crc kubenswrapper[4841]: E0130 05:33:23.433084 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:33:37 crc kubenswrapper[4841]: I0130 05:33:37.432482 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:33:37 crc kubenswrapper[4841]: E0130 05:33:37.433344 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.431909 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:33:48 crc kubenswrapper[4841]: E0130 05:33:48.433054 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.566664 4841 scope.go:117] "RemoveContainer" containerID="324bbfef51820fed27231247bde9ff6d0f8f2aab3fd13c87c30968e9cff134b6" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.600230 4841 scope.go:117] "RemoveContainer" containerID="423863723ace49a11675767356ead1a32dc7658d7769ea7c64a96cba5343a3ca" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.641974 4841 scope.go:117] "RemoveContainer" containerID="f7006000010c70fd3939a321e93d354c55c7a81ade457ce7388432e79c0a136c" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.670775 4841 scope.go:117] "RemoveContainer" containerID="f6437c7d036b9ccabd53ece2b8030d2d059235373e2a041d4eb65c202968d653" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.697970 4841 scope.go:117] "RemoveContainer" containerID="31ef83de4f5eec8256ad9a9714b034f94d5c23abce8bac25f0acaf273fe574d8" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.727531 4841 scope.go:117] "RemoveContainer" containerID="516c5fd4cd75d7f12188d52666c9e2f8ee88173bfc6842c75dfd525085b58059" 
Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.756327 4841 scope.go:117] "RemoveContainer" containerID="4569b4b5e37876eda335dcbd62e5a674767059e5c757482cdf828e323ff293d5" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.784528 4841 scope.go:117] "RemoveContainer" containerID="4d21eef9c1c9ed046a97390ad3dd4aed5f37df0881ec47b58bf4b020f79db06f" Jan 30 05:33:48 crc kubenswrapper[4841]: I0130 05:33:48.812111 4841 scope.go:117] "RemoveContainer" containerID="995cb343dd7386b355ae3b1c076e4fe5c75a2ee001f0d1549fd9e15f90bb22a0" Jan 30 05:34:01 crc kubenswrapper[4841]: I0130 05:34:01.432735 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:34:01 crc kubenswrapper[4841]: E0130 05:34:01.433710 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:34:16 crc kubenswrapper[4841]: I0130 05:34:16.431867 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:34:16 crc kubenswrapper[4841]: E0130 05:34:16.432876 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.270257 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-wgw4n"] Jan 30 05:34:23 crc kubenswrapper[4841]: E0130 05:34:23.270792 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="registry-server" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.270806 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="registry-server" Jan 30 05:34:23 crc kubenswrapper[4841]: E0130 05:34:23.270840 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="extract-content" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.270848 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="extract-content" Jan 30 05:34:23 crc kubenswrapper[4841]: E0130 05:34:23.270869 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="extract-utilities" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.270877 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="extract-utilities" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.271034 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f72a5b9d-fa64-4506-8312-ed76d0c4eb21" containerName="registry-server" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.272119 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgw4n"] Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.272201 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.435493 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-catalog-content\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.435822 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl622\" (UniqueName: \"kubernetes.io/projected/32decd79-d595-4a77-9963-d1efaf5eae97-kube-api-access-bl622\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.435952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-utilities\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.537300 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-catalog-content\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.537667 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl622\" (UniqueName: \"kubernetes.io/projected/32decd79-d595-4a77-9963-d1efaf5eae97-kube-api-access-bl622\") pod 
\"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.537787 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-utilities\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.537747 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-catalog-content\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.538463 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-utilities\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.555734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl622\" (UniqueName: \"kubernetes.io/projected/32decd79-d595-4a77-9963-d1efaf5eae97-kube-api-access-bl622\") pod \"community-operators-wgw4n\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:23 crc kubenswrapper[4841]: I0130 05:34:23.616608 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:24 crc kubenswrapper[4841]: I0130 05:34:24.135191 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgw4n"] Jan 30 05:34:24 crc kubenswrapper[4841]: I0130 05:34:24.238282 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgw4n" event={"ID":"32decd79-d595-4a77-9963-d1efaf5eae97","Type":"ContainerStarted","Data":"75389c515d9d737721484b2e168a76b148c7b52fb80090abb3081a1529a7c2a7"} Jan 30 05:34:25 crc kubenswrapper[4841]: I0130 05:34:25.251083 4841 generic.go:334] "Generic (PLEG): container finished" podID="32decd79-d595-4a77-9963-d1efaf5eae97" containerID="549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b" exitCode=0 Jan 30 05:34:25 crc kubenswrapper[4841]: I0130 05:34:25.251145 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgw4n" event={"ID":"32decd79-d595-4a77-9963-d1efaf5eae97","Type":"ContainerDied","Data":"549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b"} Jan 30 05:34:25 crc kubenswrapper[4841]: I0130 05:34:25.257929 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:34:26 crc kubenswrapper[4841]: I0130 05:34:26.265095 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgw4n" event={"ID":"32decd79-d595-4a77-9963-d1efaf5eae97","Type":"ContainerStarted","Data":"88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e"} Jan 30 05:34:27 crc kubenswrapper[4841]: I0130 05:34:27.278762 4841 generic.go:334] "Generic (PLEG): container finished" podID="32decd79-d595-4a77-9963-d1efaf5eae97" containerID="88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e" exitCode=0 Jan 30 05:34:27 crc kubenswrapper[4841]: I0130 05:34:27.278873 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-wgw4n" event={"ID":"32decd79-d595-4a77-9963-d1efaf5eae97","Type":"ContainerDied","Data":"88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e"} Jan 30 05:34:28 crc kubenswrapper[4841]: I0130 05:34:28.291852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgw4n" event={"ID":"32decd79-d595-4a77-9963-d1efaf5eae97","Type":"ContainerStarted","Data":"3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d"} Jan 30 05:34:28 crc kubenswrapper[4841]: I0130 05:34:28.338670 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wgw4n" podStartSLOduration=2.850629253 podStartE2EDuration="5.338643996s" podCreationTimestamp="2026-01-30 05:34:23 +0000 UTC" firstStartedPulling="2026-01-30 05:34:25.254876018 +0000 UTC m=+1602.248348696" lastFinishedPulling="2026-01-30 05:34:27.742890771 +0000 UTC m=+1604.736363439" observedRunningTime="2026-01-30 05:34:28.328198796 +0000 UTC m=+1605.321671444" watchObservedRunningTime="2026-01-30 05:34:28.338643996 +0000 UTC m=+1605.332116674" Jan 30 05:34:28 crc kubenswrapper[4841]: I0130 05:34:28.432360 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:34:28 crc kubenswrapper[4841]: E0130 05:34:28.432762 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:34:33 crc kubenswrapper[4841]: I0130 05:34:33.617658 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:33 crc kubenswrapper[4841]: I0130 05:34:33.618358 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:33 crc kubenswrapper[4841]: I0130 05:34:33.690502 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:34 crc kubenswrapper[4841]: I0130 05:34:34.413824 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:34 crc kubenswrapper[4841]: I0130 05:34:34.488870 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgw4n"] Jan 30 05:34:36 crc kubenswrapper[4841]: I0130 05:34:36.362755 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wgw4n" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="registry-server" containerID="cri-o://3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d" gracePeriod=2 Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.325052 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.374633 4841 generic.go:334] "Generic (PLEG): container finished" podID="32decd79-d595-4a77-9963-d1efaf5eae97" containerID="3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d" exitCode=0 Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.374693 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgw4n" event={"ID":"32decd79-d595-4a77-9963-d1efaf5eae97","Type":"ContainerDied","Data":"3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d"} Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.374733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgw4n" event={"ID":"32decd79-d595-4a77-9963-d1efaf5eae97","Type":"ContainerDied","Data":"75389c515d9d737721484b2e168a76b148c7b52fb80090abb3081a1529a7c2a7"} Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.374761 4841 scope.go:117] "RemoveContainer" containerID="3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.374924 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgw4n" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.382384 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-utilities\") pod \"32decd79-d595-4a77-9963-d1efaf5eae97\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.382549 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl622\" (UniqueName: \"kubernetes.io/projected/32decd79-d595-4a77-9963-d1efaf5eae97-kube-api-access-bl622\") pod \"32decd79-d595-4a77-9963-d1efaf5eae97\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.382608 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-catalog-content\") pod \"32decd79-d595-4a77-9963-d1efaf5eae97\" (UID: \"32decd79-d595-4a77-9963-d1efaf5eae97\") " Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.387916 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-utilities" (OuterVolumeSpecName: "utilities") pod "32decd79-d595-4a77-9963-d1efaf5eae97" (UID: "32decd79-d595-4a77-9963-d1efaf5eae97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.393686 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32decd79-d595-4a77-9963-d1efaf5eae97-kube-api-access-bl622" (OuterVolumeSpecName: "kube-api-access-bl622") pod "32decd79-d595-4a77-9963-d1efaf5eae97" (UID: "32decd79-d595-4a77-9963-d1efaf5eae97"). InnerVolumeSpecName "kube-api-access-bl622". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.414901 4841 scope.go:117] "RemoveContainer" containerID="88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.444635 4841 scope.go:117] "RemoveContainer" containerID="549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.466359 4841 scope.go:117] "RemoveContainer" containerID="3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d" Jan 30 05:34:37 crc kubenswrapper[4841]: E0130 05:34:37.468221 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d\": container with ID starting with 3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d not found: ID does not exist" containerID="3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.468265 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d"} err="failed to get container status \"3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d\": rpc error: code = NotFound desc = could not find container \"3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d\": container with ID starting with 3f39196baa37407f24960723307c4fa110fbede88f055bba85d1d1c9a6aa285d not found: ID does not exist" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.468289 4841 scope.go:117] "RemoveContainer" containerID="88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e" Jan 30 05:34:37 crc kubenswrapper[4841]: E0130 05:34:37.468788 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e\": container with ID starting with 88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e not found: ID does not exist" containerID="88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.468862 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e"} err="failed to get container status \"88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e\": rpc error: code = NotFound desc = could not find container \"88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e\": container with ID starting with 88913a32ab19756c87e4c2db542d9418babd6957702516168ae700b58a0f355e not found: ID does not exist" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.468906 4841 scope.go:117] "RemoveContainer" containerID="549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.468911 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32decd79-d595-4a77-9963-d1efaf5eae97" (UID: "32decd79-d595-4a77-9963-d1efaf5eae97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:34:37 crc kubenswrapper[4841]: E0130 05:34:37.469269 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b\": container with ID starting with 549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b not found: ID does not exist" containerID="549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.469289 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b"} err="failed to get container status \"549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b\": rpc error: code = NotFound desc = could not find container \"549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b\": container with ID starting with 549df22f21f188652ece0446052372d621d98356d872f57d53c72b52536a0a6b not found: ID does not exist" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.484099 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl622\" (UniqueName: \"kubernetes.io/projected/32decd79-d595-4a77-9963-d1efaf5eae97-kube-api-access-bl622\") on node \"crc\" DevicePath \"\"" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.484140 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.484150 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32decd79-d595-4a77-9963-d1efaf5eae97-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.719112 4841 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgw4n"] Jan 30 05:34:37 crc kubenswrapper[4841]: I0130 05:34:37.732312 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wgw4n"] Jan 30 05:34:38 crc kubenswrapper[4841]: I0130 05:34:38.439289 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" path="/var/lib/kubelet/pods/32decd79-d595-4a77-9963-d1efaf5eae97/volumes" Jan 30 05:34:40 crc kubenswrapper[4841]: I0130 05:34:40.432647 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:34:40 crc kubenswrapper[4841]: E0130 05:34:40.433475 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:34:49 crc kubenswrapper[4841]: I0130 05:34:49.031254 4841 scope.go:117] "RemoveContainer" containerID="23495fc7312790940a345e75d7fefde2b7f84aea9ac21773cd38a0282daaaf9f" Jan 30 05:34:49 crc kubenswrapper[4841]: I0130 05:34:49.097321 4841 scope.go:117] "RemoveContainer" containerID="38df243ff6c49b1e54302c7946a385effd3bf2fb3d74c73e327e4d81515afc49" Jan 30 05:34:51 crc kubenswrapper[4841]: I0130 05:34:51.433015 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:34:51 crc kubenswrapper[4841]: E0130 05:34:51.433823 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:35:04 crc kubenswrapper[4841]: I0130 05:35:04.440190 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:35:04 crc kubenswrapper[4841]: E0130 05:35:04.441129 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:35:17 crc kubenswrapper[4841]: I0130 05:35:17.431893 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:35:17 crc kubenswrapper[4841]: E0130 05:35:17.432880 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:35:30 crc kubenswrapper[4841]: I0130 05:35:30.431901 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be" Jan 30 05:35:30 crc kubenswrapper[4841]: E0130 05:35:30.432708 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:35:41 crc kubenswrapper[4841]: I0130 05:35:41.432495 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:35:41 crc kubenswrapper[4841]: E0130 05:35:41.433466 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:35:49 crc kubenswrapper[4841]: I0130 05:35:49.207872 4841 scope.go:117] "RemoveContainer" containerID="e8fd8e866eed90045f99756e46172a271386a959f2b77da427ad4735cdb79473"
Jan 30 05:35:49 crc kubenswrapper[4841]: I0130 05:35:49.247430 4841 scope.go:117] "RemoveContainer" containerID="2888bc3f6d872d8a4049937fb5ba2193d6506521580fd4ed32cf36760940a837"
Jan 30 05:35:49 crc kubenswrapper[4841]: I0130 05:35:49.273520 4841 scope.go:117] "RemoveContainer" containerID="a09f001bb78e76ebfd5fd7c52c6c812ab40aeee42dc914fb41353c36d1cdd59c"
Jan 30 05:35:49 crc kubenswrapper[4841]: I0130 05:35:49.334037 4841 scope.go:117] "RemoveContainer" containerID="d71848d69df407548c21d416cd1905e03bc4273aed56307ec048215b6bd60f64"
Jan 30 05:35:49 crc kubenswrapper[4841]: I0130 05:35:49.363095 4841 scope.go:117] "RemoveContainer" containerID="cfc0ae345e05eb3f086c878d95ec4681a8950e4f7bd6e49a010f4a0710f07baa"
Jan 30 05:35:56 crc kubenswrapper[4841]: I0130 05:35:56.432821 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:35:56 crc kubenswrapper[4841]: E0130 05:35:56.433777 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:36:08 crc kubenswrapper[4841]: I0130 05:36:08.432618 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:36:08 crc kubenswrapper[4841]: E0130 05:36:08.433613 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:36:21 crc kubenswrapper[4841]: I0130 05:36:21.433757 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:36:21 crc kubenswrapper[4841]: E0130 05:36:21.435102 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:36:36 crc kubenswrapper[4841]: I0130 05:36:36.433018 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:36:36 crc kubenswrapper[4841]: E0130 05:36:36.433990 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:36:49 crc kubenswrapper[4841]: I0130 05:36:49.432496 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:36:49 crc kubenswrapper[4841]: E0130 05:36:49.433723 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:37:02 crc kubenswrapper[4841]: I0130 05:37:02.433754 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:37:02 crc kubenswrapper[4841]: E0130 05:37:02.435270 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:37:13 crc kubenswrapper[4841]: I0130 05:37:13.432503 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:37:13 crc kubenswrapper[4841]: E0130 05:37:13.433556 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:37:24 crc kubenswrapper[4841]: I0130 05:37:24.440169 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:37:24 crc kubenswrapper[4841]: E0130 05:37:24.443170 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:37:36 crc kubenswrapper[4841]: I0130 05:37:36.431569 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:37:36 crc kubenswrapper[4841]: E0130 05:37:36.432100 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:37:51 crc kubenswrapper[4841]: I0130 05:37:51.433207 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:37:52 crc kubenswrapper[4841]: I0130 05:37:52.239821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"d93d7aff4ef3c7b3da19fa4258cb49faa7a94bfc0cfa7cccce220bc86d0e267b"}
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.088424 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j2j4r"]
Jan 30 05:39:17 crc kubenswrapper[4841]: E0130 05:39:17.089540 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="extract-utilities"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.089563 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="extract-utilities"
Jan 30 05:39:17 crc kubenswrapper[4841]: E0130 05:39:17.089597 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="extract-content"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.089611 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="extract-content"
Jan 30 05:39:17 crc kubenswrapper[4841]: E0130 05:39:17.089652 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="registry-server"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.089665 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="registry-server"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.089951 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="32decd79-d595-4a77-9963-d1efaf5eae97" containerName="registry-server"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.091852 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.104745 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2j4r"]
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.215214 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-utilities\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.215337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-catalog-content\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.215430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4tkc\" (UniqueName: \"kubernetes.io/projected/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-kube-api-access-f4tkc\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.316705 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-utilities\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.316770 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-catalog-content\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.316810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4tkc\" (UniqueName: \"kubernetes.io/projected/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-kube-api-access-f4tkc\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.317352 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-utilities\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.317602 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-catalog-content\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.348813 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4tkc\" (UniqueName: \"kubernetes.io/projected/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-kube-api-access-f4tkc\") pod \"redhat-operators-j2j4r\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") " pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.411743 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:17 crc kubenswrapper[4841]: I0130 05:39:17.876807 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2j4r"]
Jan 30 05:39:18 crc kubenswrapper[4841]: I0130 05:39:18.059144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2j4r" event={"ID":"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf","Type":"ContainerStarted","Data":"39cac2504dbbfa6dc0116cf27fd792cfd133892090de6afdb6ab0451d9e067f0"}
Jan 30 05:39:19 crc kubenswrapper[4841]: I0130 05:39:19.072352 4841 generic.go:334] "Generic (PLEG): container finished" podID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerID="99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86" exitCode=0
Jan 30 05:39:19 crc kubenswrapper[4841]: I0130 05:39:19.072503 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2j4r" event={"ID":"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf","Type":"ContainerDied","Data":"99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86"}
Jan 30 05:39:20 crc kubenswrapper[4841]: I0130 05:39:20.082446 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2j4r" event={"ID":"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf","Type":"ContainerStarted","Data":"730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5"}
Jan 30 05:39:21 crc kubenswrapper[4841]: I0130 05:39:21.092050 4841 generic.go:334] "Generic (PLEG): container finished" podID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerID="730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5" exitCode=0
Jan 30 05:39:21 crc kubenswrapper[4841]: I0130 05:39:21.092123 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2j4r" event={"ID":"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf","Type":"ContainerDied","Data":"730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5"}
Jan 30 05:39:22 crc kubenswrapper[4841]: I0130 05:39:22.101963 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2j4r" event={"ID":"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf","Type":"ContainerStarted","Data":"03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909"}
Jan 30 05:39:22 crc kubenswrapper[4841]: I0130 05:39:22.125293 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j2j4r" podStartSLOduration=2.703049766 podStartE2EDuration="5.125274243s" podCreationTimestamp="2026-01-30 05:39:17 +0000 UTC" firstStartedPulling="2026-01-30 05:39:19.075215065 +0000 UTC m=+1896.068687743" lastFinishedPulling="2026-01-30 05:39:21.497439582 +0000 UTC m=+1898.490912220" observedRunningTime="2026-01-30 05:39:22.12295661 +0000 UTC m=+1899.116429278" watchObservedRunningTime="2026-01-30 05:39:22.125274243 +0000 UTC m=+1899.118746891"
Jan 30 05:39:27 crc kubenswrapper[4841]: I0130 05:39:27.412116 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:27 crc kubenswrapper[4841]: I0130 05:39:27.412575 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:28 crc kubenswrapper[4841]: I0130 05:39:28.484566 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j2j4r" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="registry-server" probeResult="failure" output=<
Jan 30 05:39:28 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 05:39:28 crc kubenswrapper[4841]: >
Jan 30 05:39:37 crc kubenswrapper[4841]: I0130 05:39:37.491699 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:37 crc kubenswrapper[4841]: I0130 05:39:37.558076 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:37 crc kubenswrapper[4841]: I0130 05:39:37.724825 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j2j4r"]
Jan 30 05:39:39 crc kubenswrapper[4841]: I0130 05:39:39.241209 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j2j4r" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="registry-server" containerID="cri-o://03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909" gracePeriod=2
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.028027 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.077220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-catalog-content\") pod \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") "
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.077313 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-utilities\") pod \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") "
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.077365 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4tkc\" (UniqueName: \"kubernetes.io/projected/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-kube-api-access-f4tkc\") pod \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\" (UID: \"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf\") "
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.078624 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-utilities" (OuterVolumeSpecName: "utilities") pod "a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" (UID: "a3bdcb59-ce42-4f4e-8755-58dc8347e6bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.086698 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-kube-api-access-f4tkc" (OuterVolumeSpecName: "kube-api-access-f4tkc") pod "a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" (UID: "a3bdcb59-ce42-4f4e-8755-58dc8347e6bf"). InnerVolumeSpecName "kube-api-access-f4tkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.179525 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.179564 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4tkc\" (UniqueName: \"kubernetes.io/projected/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-kube-api-access-f4tkc\") on node \"crc\" DevicePath \"\""
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.221778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" (UID: "a3bdcb59-ce42-4f4e-8755-58dc8347e6bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.252810 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2j4r" event={"ID":"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf","Type":"ContainerDied","Data":"03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909"}
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.252879 4841 scope.go:117] "RemoveContainer" containerID="03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.252837 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2j4r"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.252871 4841 generic.go:334] "Generic (PLEG): container finished" podID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerID="03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909" exitCode=0
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.252990 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2j4r" event={"ID":"a3bdcb59-ce42-4f4e-8755-58dc8347e6bf","Type":"ContainerDied","Data":"39cac2504dbbfa6dc0116cf27fd792cfd133892090de6afdb6ab0451d9e067f0"}
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.284893 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.289287 4841 scope.go:117] "RemoveContainer" containerID="730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.294205 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j2j4r"]
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.303655 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j2j4r"]
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.319849 4841 scope.go:117] "RemoveContainer" containerID="99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.351329 4841 scope.go:117] "RemoveContainer" containerID="03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909"
Jan 30 05:39:40 crc kubenswrapper[4841]: E0130 05:39:40.351917 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909\": container with ID starting with 03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909 not found: ID does not exist" containerID="03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.351973 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909"} err="failed to get container status \"03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909\": rpc error: code = NotFound desc = could not find container \"03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909\": container with ID starting with 03a894cc1c19c45eb3f20ae1a8733e7e9ef1cf8c333609ec666a0f4f55653909 not found: ID does not exist"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.352010 4841 scope.go:117] "RemoveContainer" containerID="730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5"
Jan 30 05:39:40 crc kubenswrapper[4841]: E0130 05:39:40.352556 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5\": container with ID starting with 730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5 not found: ID does not exist" containerID="730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.352589 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5"} err="failed to get container status \"730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5\": rpc error: code = NotFound desc = could not find container \"730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5\": container with ID starting with 730e2e320727c0bc8fb9a78ff9a5823eba1d83569f6b1d46c32e214af6b2b0b5 not found: ID does not exist"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.352608 4841 scope.go:117] "RemoveContainer" containerID="99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86"
Jan 30 05:39:40 crc kubenswrapper[4841]: E0130 05:39:40.352896 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86\": container with ID starting with 99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86 not found: ID does not exist" containerID="99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.352937 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86"} err="failed to get container status \"99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86\": rpc error: code = NotFound desc = could not find container \"99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86\": container with ID starting with 99fe1c583a6489519d85640fda12deb72ce2d6e300d7ec20055dc858ecca4e86 not found: ID does not exist"
Jan 30 05:39:40 crc kubenswrapper[4841]: I0130 05:39:40.444238 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" path="/var/lib/kubelet/pods/a3bdcb59-ce42-4f4e-8755-58dc8347e6bf/volumes"
Jan 30 05:40:10 crc kubenswrapper[4841]: I0130 05:40:10.463652 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:40:10 crc kubenswrapper[4841]: I0130 05:40:10.466076 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:40:40 crc kubenswrapper[4841]: I0130 05:40:40.463979 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:40:40 crc kubenswrapper[4841]: I0130 05:40:40.464534 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.464258 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.464898 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.465005 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.465902 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d93d7aff4ef3c7b3da19fa4258cb49faa7a94bfc0cfa7cccce220bc86d0e267b"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.466002 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://d93d7aff4ef3c7b3da19fa4258cb49faa7a94bfc0cfa7cccce220bc86d0e267b" gracePeriod=600
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.835783 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="d93d7aff4ef3c7b3da19fa4258cb49faa7a94bfc0cfa7cccce220bc86d0e267b" exitCode=0
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.835861 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"d93d7aff4ef3c7b3da19fa4258cb49faa7a94bfc0cfa7cccce220bc86d0e267b"}
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.836285 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"}
Jan 30 05:41:10 crc kubenswrapper[4841]: I0130 05:41:10.836319 4841 scope.go:117] "RemoveContainer" containerID="1f16ef898848d5084f7058a34665c9d0913ef216d48c1aef6d0f6d033f52a8be"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.883735 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-brtkj"]
Jan 30 05:43:02 crc kubenswrapper[4841]: E0130 05:43:02.884825 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="extract-content"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.884847 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="extract-content"
Jan 30 05:43:02 crc kubenswrapper[4841]: E0130 05:43:02.884881 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="extract-utilities"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.884894 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="extract-utilities"
Jan 30 05:43:02 crc kubenswrapper[4841]: E0130 05:43:02.884927 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="registry-server"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.884940 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="registry-server"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.885192 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3bdcb59-ce42-4f4e-8755-58dc8347e6bf" containerName="registry-server"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.886870 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.898591 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brtkj"]
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.928528 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvwmr\" (UniqueName: \"kubernetes.io/projected/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-kube-api-access-rvwmr\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.928616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-utilities\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:02 crc kubenswrapper[4841]: I0130 05:43:02.928732 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-catalog-content\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.030025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-catalog-content\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.030377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvwmr\" (UniqueName: \"kubernetes.io/projected/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-kube-api-access-rvwmr\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.030547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-utilities\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.030667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-catalog-content\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.031045 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-utilities\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.053582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvwmr\" (UniqueName: \"kubernetes.io/projected/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-kube-api-access-rvwmr\") pod \"redhat-marketplace-brtkj\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.241176 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brtkj"
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.766253 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brtkj"]
Jan 30 05:43:03 crc kubenswrapper[4841]: I0130 05:43:03.865730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brtkj" event={"ID":"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e","Type":"ContainerStarted","Data":"3b87ad7fb39062511943b3710f1602d95440ba58de02abe1bbf34a9bde74aa30"}
Jan 30 05:43:04 crc kubenswrapper[4841]: I0130 05:43:04.882669 4841 generic.go:334] "Generic (PLEG): container finished" podID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerID="8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e" exitCode=0
Jan 30 05:43:04 crc kubenswrapper[4841]: I0130 05:43:04.882708 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brtkj" event={"ID":"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e","Type":"ContainerDied","Data":"8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e"}
Jan 30 05:43:04 crc kubenswrapper[4841]: I0130 05:43:04.886590 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 05:43:05 crc kubenswrapper[4841]: I0130 05:43:05.894192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brtkj" event={"ID":"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e","Type":"ContainerStarted","Data":"9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047"}
Jan 30 05:43:06 crc kubenswrapper[4841]: I0130 05:43:06.904120 4841 generic.go:334] "Generic (PLEG): container finished" podID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerID="9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047" exitCode=0
Jan 30 05:43:06 crc kubenswrapper[4841]: I0130 05:43:06.904189 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brtkj" event={"ID":"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e","Type":"ContainerDied","Data":"9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047"}
Jan 30 05:43:08 crc kubenswrapper[4841]: I0130 05:43:08.925760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brtkj" event={"ID":"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e","Type":"ContainerStarted","Data":"59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3"}
Jan 30 05:43:08 crc kubenswrapper[4841]: I0130 05:43:08.955111 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-brtkj" podStartSLOduration=3.917517368 podStartE2EDuration="6.955079756s" podCreationTimestamp="2026-01-30 05:43:02 +0000 UTC" firstStartedPulling="2026-01-30 05:43:04.886388272 +0000 UTC m=+2121.879860910" lastFinishedPulling="2026-01-30 05:43:07.92395062 +0000 UTC m=+2124.917423298" observedRunningTime="2026-01-30 05:43:08.9515211 +0000 UTC m=+2125.944993778" watchObservedRunningTime="2026-01-30 05:43:08.955079756 +0000 UTC m=+2125.948552434"
Jan 30 05:43:10 crc kubenswrapper[4841]: I0130 05:43:10.463379 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:43:10 crc kubenswrapper[4841]: I0130 05:43:10.463830 4841 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:43:13 crc kubenswrapper[4841]: I0130 05:43:13.241484 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-brtkj" Jan 30 05:43:13 crc kubenswrapper[4841]: I0130 05:43:13.241580 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-brtkj" Jan 30 05:43:13 crc kubenswrapper[4841]: I0130 05:43:13.317217 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-brtkj" Jan 30 05:43:14 crc kubenswrapper[4841]: I0130 05:43:14.030554 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-brtkj" Jan 30 05:43:14 crc kubenswrapper[4841]: I0130 05:43:14.088360 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brtkj"] Jan 30 05:43:15 crc kubenswrapper[4841]: I0130 05:43:15.981387 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-brtkj" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="registry-server" containerID="cri-o://59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3" gracePeriod=2 Jan 30 05:43:15 crc kubenswrapper[4841]: I0130 05:43:15.996521 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xmch5"] Jan 30 05:43:15 crc kubenswrapper[4841]: I0130 05:43:15.999001 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.026135 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmch5"] Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.052714 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-catalog-content\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.052757 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-utilities\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.052834 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgkd5\" (UniqueName: \"kubernetes.io/projected/74fc80cd-7f95-4b5d-a626-4c519bc37248-kube-api-access-dgkd5\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.154271 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgkd5\" (UniqueName: \"kubernetes.io/projected/74fc80cd-7f95-4b5d-a626-4c519bc37248-kube-api-access-dgkd5\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.154649 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-catalog-content\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.154829 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-utilities\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.155138 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-catalog-content\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.155283 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-utilities\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.173672 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgkd5\" (UniqueName: \"kubernetes.io/projected/74fc80cd-7f95-4b5d-a626-4c519bc37248-kube-api-access-dgkd5\") pod \"certified-operators-xmch5\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.347794 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.880402 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmch5"] Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.941081 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brtkj" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.968932 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-utilities\") pod \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.969005 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-catalog-content\") pod \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.969035 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvwmr\" (UniqueName: \"kubernetes.io/projected/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-kube-api-access-rvwmr\") pod \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\" (UID: \"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e\") " Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.971516 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-utilities" (OuterVolumeSpecName: "utilities") pod "ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" (UID: "ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.973546 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-kube-api-access-rvwmr" (OuterVolumeSpecName: "kube-api-access-rvwmr") pod "ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" (UID: "ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e"). InnerVolumeSpecName "kube-api-access-rvwmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.990088 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmch5" event={"ID":"74fc80cd-7f95-4b5d-a626-4c519bc37248","Type":"ContainerStarted","Data":"cebae42b57221b2f370c3a5b89ae04b8354e61e346edafdc7d98854e32928184"} Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.995243 4841 generic.go:334] "Generic (PLEG): container finished" podID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerID="59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3" exitCode=0 Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.995304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brtkj" event={"ID":"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e","Type":"ContainerDied","Data":"59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3"} Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.995327 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brtkj" event={"ID":"ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e","Type":"ContainerDied","Data":"3b87ad7fb39062511943b3710f1602d95440ba58de02abe1bbf34a9bde74aa30"} Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.995347 4841 scope.go:117] "RemoveContainer" containerID="59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3" Jan 30 05:43:16 crc kubenswrapper[4841]: I0130 05:43:16.995538 4841 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brtkj" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.012532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" (UID: "ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.018867 4841 scope.go:117] "RemoveContainer" containerID="9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.037483 4841 scope.go:117] "RemoveContainer" containerID="8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.051594 4841 scope.go:117] "RemoveContainer" containerID="59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3" Jan 30 05:43:17 crc kubenswrapper[4841]: E0130 05:43:17.052039 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3\": container with ID starting with 59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3 not found: ID does not exist" containerID="59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.052087 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3"} err="failed to get container status \"59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3\": rpc error: code = NotFound desc = could not find container 
\"59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3\": container with ID starting with 59a98dd7cea1f659e04f03dec8625b3befee4a959932c7f85b9ccba77908b4f3 not found: ID does not exist" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.052113 4841 scope.go:117] "RemoveContainer" containerID="9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047" Jan 30 05:43:17 crc kubenswrapper[4841]: E0130 05:43:17.052438 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047\": container with ID starting with 9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047 not found: ID does not exist" containerID="9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.052469 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047"} err="failed to get container status \"9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047\": rpc error: code = NotFound desc = could not find container \"9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047\": container with ID starting with 9ee76b8b3657651ccd5025a5104860a6ea99a38918673604187694be119c1047 not found: ID does not exist" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.052490 4841 scope.go:117] "RemoveContainer" containerID="8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e" Jan 30 05:43:17 crc kubenswrapper[4841]: E0130 05:43:17.052745 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e\": container with ID starting with 8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e not found: ID does not exist" 
containerID="8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.052787 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e"} err="failed to get container status \"8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e\": rpc error: code = NotFound desc = could not find container \"8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e\": container with ID starting with 8939d6f420a3f3246ed1afc342085f3efcace3a8ce9a7bae549e4f497d594a5e not found: ID does not exist" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.071005 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.071034 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.071044 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvwmr\" (UniqueName: \"kubernetes.io/projected/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e-kube-api-access-rvwmr\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.334097 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brtkj"] Jan 30 05:43:17 crc kubenswrapper[4841]: I0130 05:43:17.338288 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-brtkj"] Jan 30 05:43:18 crc kubenswrapper[4841]: I0130 05:43:18.006005 4841 generic.go:334] "Generic (PLEG): container finished" podID="74fc80cd-7f95-4b5d-a626-4c519bc37248" 
containerID="5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788" exitCode=0 Jan 30 05:43:18 crc kubenswrapper[4841]: I0130 05:43:18.006144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmch5" event={"ID":"74fc80cd-7f95-4b5d-a626-4c519bc37248","Type":"ContainerDied","Data":"5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788"} Jan 30 05:43:18 crc kubenswrapper[4841]: I0130 05:43:18.449705 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" path="/var/lib/kubelet/pods/ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e/volumes" Jan 30 05:43:19 crc kubenswrapper[4841]: I0130 05:43:19.019694 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmch5" event={"ID":"74fc80cd-7f95-4b5d-a626-4c519bc37248","Type":"ContainerStarted","Data":"16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71"} Jan 30 05:43:20 crc kubenswrapper[4841]: I0130 05:43:20.033705 4841 generic.go:334] "Generic (PLEG): container finished" podID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerID="16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71" exitCode=0 Jan 30 05:43:20 crc kubenswrapper[4841]: I0130 05:43:20.033776 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmch5" event={"ID":"74fc80cd-7f95-4b5d-a626-4c519bc37248","Type":"ContainerDied","Data":"16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71"} Jan 30 05:43:21 crc kubenswrapper[4841]: I0130 05:43:21.044389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmch5" event={"ID":"74fc80cd-7f95-4b5d-a626-4c519bc37248","Type":"ContainerStarted","Data":"e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b"} Jan 30 05:43:21 crc kubenswrapper[4841]: I0130 05:43:21.071344 4841 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/certified-operators-xmch5" podStartSLOduration=3.631454899 podStartE2EDuration="6.071316326s" podCreationTimestamp="2026-01-30 05:43:15 +0000 UTC" firstStartedPulling="2026-01-30 05:43:18.008957317 +0000 UTC m=+2135.002429985" lastFinishedPulling="2026-01-30 05:43:20.448818774 +0000 UTC m=+2137.442291412" observedRunningTime="2026-01-30 05:43:21.064890541 +0000 UTC m=+2138.058363189" watchObservedRunningTime="2026-01-30 05:43:21.071316326 +0000 UTC m=+2138.064788974" Jan 30 05:43:26 crc kubenswrapper[4841]: I0130 05:43:26.348244 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:26 crc kubenswrapper[4841]: I0130 05:43:26.349064 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:26 crc kubenswrapper[4841]: I0130 05:43:26.422948 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:27 crc kubenswrapper[4841]: I0130 05:43:27.153395 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:27 crc kubenswrapper[4841]: I0130 05:43:27.208715 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmch5"] Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.107627 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xmch5" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="registry-server" containerID="cri-o://e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b" gracePeriod=2 Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.516173 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.583561 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-utilities\") pod \"74fc80cd-7f95-4b5d-a626-4c519bc37248\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.583648 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-catalog-content\") pod \"74fc80cd-7f95-4b5d-a626-4c519bc37248\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.583688 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgkd5\" (UniqueName: \"kubernetes.io/projected/74fc80cd-7f95-4b5d-a626-4c519bc37248-kube-api-access-dgkd5\") pod \"74fc80cd-7f95-4b5d-a626-4c519bc37248\" (UID: \"74fc80cd-7f95-4b5d-a626-4c519bc37248\") " Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.584667 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-utilities" (OuterVolumeSpecName: "utilities") pod "74fc80cd-7f95-4b5d-a626-4c519bc37248" (UID: "74fc80cd-7f95-4b5d-a626-4c519bc37248"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.596028 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74fc80cd-7f95-4b5d-a626-4c519bc37248-kube-api-access-dgkd5" (OuterVolumeSpecName: "kube-api-access-dgkd5") pod "74fc80cd-7f95-4b5d-a626-4c519bc37248" (UID: "74fc80cd-7f95-4b5d-a626-4c519bc37248"). InnerVolumeSpecName "kube-api-access-dgkd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.685671 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgkd5\" (UniqueName: \"kubernetes.io/projected/74fc80cd-7f95-4b5d-a626-4c519bc37248-kube-api-access-dgkd5\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:29 crc kubenswrapper[4841]: I0130 05:43:29.685713 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.021476 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74fc80cd-7f95-4b5d-a626-4c519bc37248" (UID: "74fc80cd-7f95-4b5d-a626-4c519bc37248"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.091479 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74fc80cd-7f95-4b5d-a626-4c519bc37248-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.121888 4841 generic.go:334] "Generic (PLEG): container finished" podID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerID="e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b" exitCode=0 Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.121934 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmch5" event={"ID":"74fc80cd-7f95-4b5d-a626-4c519bc37248","Type":"ContainerDied","Data":"e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b"} Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.121961 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-xmch5" event={"ID":"74fc80cd-7f95-4b5d-a626-4c519bc37248","Type":"ContainerDied","Data":"cebae42b57221b2f370c3a5b89ae04b8354e61e346edafdc7d98854e32928184"} Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.121980 4841 scope.go:117] "RemoveContainer" containerID="e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.122099 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmch5" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.143973 4841 scope.go:117] "RemoveContainer" containerID="16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.170145 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmch5"] Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.177277 4841 scope.go:117] "RemoveContainer" containerID="5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.182069 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xmch5"] Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.208375 4841 scope.go:117] "RemoveContainer" containerID="e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b" Jan 30 05:43:30 crc kubenswrapper[4841]: E0130 05:43:30.209024 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b\": container with ID starting with e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b not found: ID does not exist" containerID="e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b" Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 
05:43:30.209063 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b"} err="failed to get container status \"e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b\": rpc error: code = NotFound desc = could not find container \"e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b\": container with ID starting with e0718e0a275d2947eec1931a70d4547fcbb4084b1f6375cd3fad0bab9472008b not found: ID does not exist"
Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.209087 4841 scope.go:117] "RemoveContainer" containerID="16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71"
Jan 30 05:43:30 crc kubenswrapper[4841]: E0130 05:43:30.209549 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71\": container with ID starting with 16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71 not found: ID does not exist" containerID="16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71"
Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.209669 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71"} err="failed to get container status \"16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71\": rpc error: code = NotFound desc = could not find container \"16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71\": container with ID starting with 16a5438d48ac25527485a5aee1c70aafebaf7793e42e8961a448695fb0c69d71 not found: ID does not exist"
Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.209714 4841 scope.go:117] "RemoveContainer" containerID="5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788"
Jan 30 05:43:30 crc kubenswrapper[4841]: E0130 05:43:30.210047 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788\": container with ID starting with 5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788 not found: ID does not exist" containerID="5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788"
Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.210082 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788"} err="failed to get container status \"5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788\": rpc error: code = NotFound desc = could not find container \"5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788\": container with ID starting with 5349a15e5128a2d5cb3fe3cce0069de9dd56a2447da8fb09b588ee17f156b788 not found: ID does not exist"
Jan 30 05:43:30 crc kubenswrapper[4841]: I0130 05:43:30.445328 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" path="/var/lib/kubelet/pods/74fc80cd-7f95-4b5d-a626-4c519bc37248/volumes"
Jan 30 05:43:40 crc kubenswrapper[4841]: I0130 05:43:40.464308 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:43:40 crc kubenswrapper[4841]: I0130 05:43:40.465187 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:44:10 crc kubenswrapper[4841]: I0130 05:44:10.463539 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:44:10 crc kubenswrapper[4841]: I0130 05:44:10.464286 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:44:10 crc kubenswrapper[4841]: I0130 05:44:10.464346 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 05:44:10 crc kubenswrapper[4841]: I0130 05:44:10.465256 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:44:10 crc kubenswrapper[4841]: I0130 05:44:10.465339 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" gracePeriod=600
Jan 30 05:44:10 crc kubenswrapper[4841]: E0130 05:44:10.608880 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:44:11 crc kubenswrapper[4841]: I0130 05:44:11.495352 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" exitCode=0
Jan 30 05:44:11 crc kubenswrapper[4841]: I0130 05:44:11.495483 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"}
Jan 30 05:44:11 crc kubenswrapper[4841]: I0130 05:44:11.495772 4841 scope.go:117] "RemoveContainer" containerID="d93d7aff4ef3c7b3da19fa4258cb49faa7a94bfc0cfa7cccce220bc86d0e267b"
Jan 30 05:44:11 crc kubenswrapper[4841]: I0130 05:44:11.496944 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"
Jan 30 05:44:11 crc kubenswrapper[4841]: E0130 05:44:11.497393 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:44:25 crc kubenswrapper[4841]: I0130 05:44:25.432808 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"
Jan 30 05:44:25 crc kubenswrapper[4841]: E0130 05:44:25.433730 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:44:39 crc kubenswrapper[4841]: I0130 05:44:39.432048 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"
Jan 30 05:44:39 crc kubenswrapper[4841]: E0130 05:44:39.433109 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:44:51 crc kubenswrapper[4841]: I0130 05:44:51.432278 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"
Jan 30 05:44:51 crc kubenswrapper[4841]: E0130 05:44:51.433210 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.405707 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vxqv8"]
Jan 30 05:44:53 crc kubenswrapper[4841]: E0130 05:44:53.406560 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="registry-server"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.406583 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="registry-server"
Jan 30 05:44:53 crc kubenswrapper[4841]: E0130 05:44:53.406617 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="extract-utilities"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.406631 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="extract-utilities"
Jan 30 05:44:53 crc kubenswrapper[4841]: E0130 05:44:53.406652 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="extract-content"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.406671 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="extract-content"
Jan 30 05:44:53 crc kubenswrapper[4841]: E0130 05:44:53.406704 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="extract-utilities"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.406717 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="extract-utilities"
Jan 30 05:44:53 crc kubenswrapper[4841]: E0130 05:44:53.406736 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="extract-content"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.406749 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="extract-content"
Jan 30 05:44:53 crc kubenswrapper[4841]: E0130 05:44:53.406774 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="registry-server"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.406788 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="registry-server"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.407056 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff30e9a8-3d31-4c6b-b7da-c91a3092fb6e" containerName="registry-server"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.407106 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="74fc80cd-7f95-4b5d-a626-4c519bc37248" containerName="registry-server"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.408948 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.416885 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxqv8"]
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.521586 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-utilities\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.521876 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-catalog-content\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.522030 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49qdd\" (UniqueName: \"kubernetes.io/projected/59c0732b-a645-4f8a-8bc8-47713a7c0165-kube-api-access-49qdd\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.623213 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-catalog-content\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.623331 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49qdd\" (UniqueName: \"kubernetes.io/projected/59c0732b-a645-4f8a-8bc8-47713a7c0165-kube-api-access-49qdd\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.623515 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-utilities\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.623855 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-catalog-content\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.624141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-utilities\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.648875 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49qdd\" (UniqueName: \"kubernetes.io/projected/59c0732b-a645-4f8a-8bc8-47713a7c0165-kube-api-access-49qdd\") pod \"community-operators-vxqv8\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") " pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:53 crc kubenswrapper[4841]: I0130 05:44:53.745582 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:44:54 crc kubenswrapper[4841]: I0130 05:44:54.308216 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxqv8"]
Jan 30 05:44:55 crc kubenswrapper[4841]: I0130 05:44:55.073578 4841 generic.go:334] "Generic (PLEG): container finished" podID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerID="4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3" exitCode=0
Jan 30 05:44:55 crc kubenswrapper[4841]: I0130 05:44:55.073644 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxqv8" event={"ID":"59c0732b-a645-4f8a-8bc8-47713a7c0165","Type":"ContainerDied","Data":"4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3"}
Jan 30 05:44:55 crc kubenswrapper[4841]: I0130 05:44:55.074014 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxqv8" event={"ID":"59c0732b-a645-4f8a-8bc8-47713a7c0165","Type":"ContainerStarted","Data":"0e2d72b3f756bbff1106fc80caf78fbc01b3529a1e7ec2ed23ebfb2eb22dacc5"}
Jan 30 05:44:57 crc kubenswrapper[4841]: I0130 05:44:57.102309 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxqv8" event={"ID":"59c0732b-a645-4f8a-8bc8-47713a7c0165","Type":"ContainerStarted","Data":"34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3"}
Jan 30 05:44:58 crc kubenswrapper[4841]: I0130 05:44:58.112329 4841 generic.go:334] "Generic (PLEG): container finished" podID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerID="34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3" exitCode=0
Jan 30 05:44:58 crc kubenswrapper[4841]: I0130 05:44:58.112382 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxqv8" event={"ID":"59c0732b-a645-4f8a-8bc8-47713a7c0165","Type":"ContainerDied","Data":"34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3"}
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.130447 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxqv8" event={"ID":"59c0732b-a645-4f8a-8bc8-47713a7c0165","Type":"ContainerStarted","Data":"0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa"}
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.169081 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vxqv8" podStartSLOduration=2.949739353 podStartE2EDuration="7.16905052s" podCreationTimestamp="2026-01-30 05:44:53 +0000 UTC" firstStartedPulling="2026-01-30 05:44:55.075898752 +0000 UTC m=+2232.069371420" lastFinishedPulling="2026-01-30 05:44:59.295209949 +0000 UTC m=+2236.288682587" observedRunningTime="2026-01-30 05:45:00.160006646 +0000 UTC m=+2237.153479324" watchObservedRunningTime="2026-01-30 05:45:00.16905052 +0000 UTC m=+2237.162523198"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.188315 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"]
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.190528 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.193789 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.195127 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.196341 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"]
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.231326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtzk\" (UniqueName: \"kubernetes.io/projected/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-kube-api-access-qhtzk\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.231422 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-secret-volume\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.231457 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-config-volume\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.333093 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtzk\" (UniqueName: \"kubernetes.io/projected/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-kube-api-access-qhtzk\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.333548 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-secret-volume\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.333748 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-config-volume\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.334708 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-config-volume\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.352670 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-secret-volume\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.367058 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtzk\" (UniqueName: \"kubernetes.io/projected/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-kube-api-access-qhtzk\") pod \"collect-profiles-29495865-f8fjk\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:00 crc kubenswrapper[4841]: I0130 05:45:00.514147 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:01 crc kubenswrapper[4841]: W0130 05:45:01.019881 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeedeb5d2_eeb0_44f1_919d_23fe3b5c0316.slice/crio-189cad61fca395ce671fd7acbec9101b04ed7e83e10bce8ce23171c5a03aa621 WatchSource:0}: Error finding container 189cad61fca395ce671fd7acbec9101b04ed7e83e10bce8ce23171c5a03aa621: Status 404 returned error can't find the container with id 189cad61fca395ce671fd7acbec9101b04ed7e83e10bce8ce23171c5a03aa621
Jan 30 05:45:01 crc kubenswrapper[4841]: I0130 05:45:01.021459 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"]
Jan 30 05:45:01 crc kubenswrapper[4841]: I0130 05:45:01.142171 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk" event={"ID":"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316","Type":"ContainerStarted","Data":"189cad61fca395ce671fd7acbec9101b04ed7e83e10bce8ce23171c5a03aa621"}
Jan 30 05:45:02 crc kubenswrapper[4841]: I0130 05:45:02.151625 4841 generic.go:334] "Generic (PLEG): container finished" podID="eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" containerID="be274802c25223a4118606561eb92b9f1ec833d4a6088d395d3838762b889ac4" exitCode=0
Jan 30 05:45:02 crc kubenswrapper[4841]: I0130 05:45:02.151720 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk" event={"ID":"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316","Type":"ContainerDied","Data":"be274802c25223a4118606561eb92b9f1ec833d4a6088d395d3838762b889ac4"}
Jan 30 05:45:02 crc kubenswrapper[4841]: I0130 05:45:02.431707 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44"
Jan 30 05:45:02 crc kubenswrapper[4841]: E0130 05:45:02.432153 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.467903 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.591876 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhtzk\" (UniqueName: \"kubernetes.io/projected/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-kube-api-access-qhtzk\") pod \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") "
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.591968 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-config-volume\") pod \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") "
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.592082 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-secret-volume\") pod \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\" (UID: \"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316\") "
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.592805 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-config-volume" (OuterVolumeSpecName: "config-volume") pod "eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" (UID: "eedeb5d2-eeb0-44f1-919d-23fe3b5c0316"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.598173 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-kube-api-access-qhtzk" (OuterVolumeSpecName: "kube-api-access-qhtzk") pod "eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" (UID: "eedeb5d2-eeb0-44f1-919d-23fe3b5c0316"). InnerVolumeSpecName "kube-api-access-qhtzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.599088 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" (UID: "eedeb5d2-eeb0-44f1-919d-23fe3b5c0316"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.694635 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhtzk\" (UniqueName: \"kubernetes.io/projected/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-kube-api-access-qhtzk\") on node \"crc\" DevicePath \"\""
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.694688 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.694705 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.746511 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.746573 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:45:03 crc kubenswrapper[4841]: I0130 05:45:03.821859 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:45:04 crc kubenswrapper[4841]: I0130 05:45:04.170323 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"
Jan 30 05:45:04 crc kubenswrapper[4841]: I0130 05:45:04.170328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk" event={"ID":"eedeb5d2-eeb0-44f1-919d-23fe3b5c0316","Type":"ContainerDied","Data":"189cad61fca395ce671fd7acbec9101b04ed7e83e10bce8ce23171c5a03aa621"}
Jan 30 05:45:04 crc kubenswrapper[4841]: I0130 05:45:04.170428 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189cad61fca395ce671fd7acbec9101b04ed7e83e10bce8ce23171c5a03aa621"
Jan 30 05:45:04 crc kubenswrapper[4841]: I0130 05:45:04.242413 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:45:04 crc kubenswrapper[4841]: I0130 05:45:04.304563 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxqv8"]
Jan 30 05:45:04 crc kubenswrapper[4841]: I0130 05:45:04.562693 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"]
Jan 30 05:45:04 crc kubenswrapper[4841]: I0130 05:45:04.569071 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495820-dbjqh"]
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.188699 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vxqv8" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerName="registry-server" containerID="cri-o://0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa" gracePeriod=2
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.449771 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705df608-7f08-4d29-aaf2-c39ae4f0e0cd" path="/var/lib/kubelet/pods/705df608-7f08-4d29-aaf2-c39ae4f0e0cd/volumes"
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.669422 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.744459 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49qdd\" (UniqueName: \"kubernetes.io/projected/59c0732b-a645-4f8a-8bc8-47713a7c0165-kube-api-access-49qdd\") pod \"59c0732b-a645-4f8a-8bc8-47713a7c0165\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") "
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.750709 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c0732b-a645-4f8a-8bc8-47713a7c0165-kube-api-access-49qdd" (OuterVolumeSpecName: "kube-api-access-49qdd") pod "59c0732b-a645-4f8a-8bc8-47713a7c0165" (UID: "59c0732b-a645-4f8a-8bc8-47713a7c0165"). InnerVolumeSpecName "kube-api-access-49qdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.845695 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-catalog-content\") pod \"59c0732b-a645-4f8a-8bc8-47713a7c0165\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") "
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.845940 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-utilities\") pod \"59c0732b-a645-4f8a-8bc8-47713a7c0165\" (UID: \"59c0732b-a645-4f8a-8bc8-47713a7c0165\") "
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.846442 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49qdd\" (UniqueName: \"kubernetes.io/projected/59c0732b-a645-4f8a-8bc8-47713a7c0165-kube-api-access-49qdd\") on node \"crc\" DevicePath \"\""
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.847701 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-utilities" (OuterVolumeSpecName: "utilities") pod "59c0732b-a645-4f8a-8bc8-47713a7c0165" (UID: "59c0732b-a645-4f8a-8bc8-47713a7c0165"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.927464 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59c0732b-a645-4f8a-8bc8-47713a7c0165" (UID: "59c0732b-a645-4f8a-8bc8-47713a7c0165"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.947847 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:45:06 crc kubenswrapper[4841]: I0130 05:45:06.947899 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59c0732b-a645-4f8a-8bc8-47713a7c0165-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.198148 4841 generic.go:334] "Generic (PLEG): container finished" podID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerID="0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa" exitCode=0
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.198191 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxqv8" event={"ID":"59c0732b-a645-4f8a-8bc8-47713a7c0165","Type":"ContainerDied","Data":"0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa"}
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.198222 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxqv8"
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.198241 4841 scope.go:117] "RemoveContainer" containerID="0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa"
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.198228 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxqv8" event={"ID":"59c0732b-a645-4f8a-8bc8-47713a7c0165","Type":"ContainerDied","Data":"0e2d72b3f756bbff1106fc80caf78fbc01b3529a1e7ec2ed23ebfb2eb22dacc5"}
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.219010 4841 scope.go:117] "RemoveContainer" containerID="34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3"
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.238982 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxqv8"]
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.246953 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vxqv8"]
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.257936 4841 scope.go:117] "RemoveContainer" containerID="4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3"
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.271962 4841 scope.go:117] "RemoveContainer" containerID="0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa"
Jan 30 05:45:07 crc kubenswrapper[4841]: E0130 05:45:07.272345 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa\": container with ID starting with 0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa not found: ID does not exist" containerID="0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa"
Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.272392 4841
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa"} err="failed to get container status \"0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa\": rpc error: code = NotFound desc = could not find container \"0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa\": container with ID starting with 0bee15bb210843cccd2470c8cfe4be988cd4cb2b6dc0f80bb2ab5d0c7c4711fa not found: ID does not exist" Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.272450 4841 scope.go:117] "RemoveContainer" containerID="34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3" Jan 30 05:45:07 crc kubenswrapper[4841]: E0130 05:45:07.272849 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3\": container with ID starting with 34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3 not found: ID does not exist" containerID="34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3" Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.272886 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3"} err="failed to get container status \"34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3\": rpc error: code = NotFound desc = could not find container \"34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3\": container with ID starting with 34a207d19032470535076e0f08ab0c3a3eac505936e3645a8033e2c6114ec7a3 not found: ID does not exist" Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.272909 4841 scope.go:117] "RemoveContainer" containerID="4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3" Jan 30 05:45:07 crc kubenswrapper[4841]: E0130 
05:45:07.273167 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3\": container with ID starting with 4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3 not found: ID does not exist" containerID="4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3" Jan 30 05:45:07 crc kubenswrapper[4841]: I0130 05:45:07.273203 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3"} err="failed to get container status \"4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3\": rpc error: code = NotFound desc = could not find container \"4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3\": container with ID starting with 4d5e8af5a9167d2d51c571d71034be1b7fabef02edb2f8134684aca13b9121f3 not found: ID does not exist" Jan 30 05:45:08 crc kubenswrapper[4841]: I0130 05:45:08.448530 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" path="/var/lib/kubelet/pods/59c0732b-a645-4f8a-8bc8-47713a7c0165/volumes" Jan 30 05:45:14 crc kubenswrapper[4841]: I0130 05:45:14.441222 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:45:14 crc kubenswrapper[4841]: E0130 05:45:14.442125 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:45:25 crc kubenswrapper[4841]: I0130 05:45:25.432749 
4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:45:25 crc kubenswrapper[4841]: E0130 05:45:25.433907 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:45:38 crc kubenswrapper[4841]: I0130 05:45:38.432012 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:45:38 crc kubenswrapper[4841]: E0130 05:45:38.433034 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:45:49 crc kubenswrapper[4841]: I0130 05:45:49.636342 4841 scope.go:117] "RemoveContainer" containerID="44c5fce58ba6b5265fed46e62435d71d97b647c9e13baca4ce7f2c4c81a3671f" Jan 30 05:45:50 crc kubenswrapper[4841]: I0130 05:45:50.431813 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:45:50 crc kubenswrapper[4841]: E0130 05:45:50.432682 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:46:01 crc kubenswrapper[4841]: I0130 05:46:01.432777 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:46:01 crc kubenswrapper[4841]: E0130 05:46:01.433689 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:46:16 crc kubenswrapper[4841]: I0130 05:46:16.432150 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:46:16 crc kubenswrapper[4841]: E0130 05:46:16.433236 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:46:27 crc kubenswrapper[4841]: I0130 05:46:27.432087 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:46:27 crc kubenswrapper[4841]: E0130 05:46:27.433372 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:46:41 crc kubenswrapper[4841]: I0130 05:46:41.431929 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:46:41 crc kubenswrapper[4841]: E0130 05:46:41.433024 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:46:56 crc kubenswrapper[4841]: I0130 05:46:56.433205 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:46:56 crc kubenswrapper[4841]: E0130 05:46:56.434377 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:47:08 crc kubenswrapper[4841]: I0130 05:47:08.434130 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:47:08 crc kubenswrapper[4841]: E0130 05:47:08.435359 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:47:19 crc kubenswrapper[4841]: I0130 05:47:19.431686 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:47:19 crc kubenswrapper[4841]: E0130 05:47:19.432620 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:47:32 crc kubenswrapper[4841]: I0130 05:47:32.435780 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:47:32 crc kubenswrapper[4841]: E0130 05:47:32.437119 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:47:45 crc kubenswrapper[4841]: I0130 05:47:45.431823 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:47:45 crc kubenswrapper[4841]: E0130 05:47:45.432553 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:47:57 crc kubenswrapper[4841]: I0130 05:47:57.432183 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:47:57 crc kubenswrapper[4841]: E0130 05:47:57.433117 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:48:09 crc kubenswrapper[4841]: I0130 05:48:09.432788 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:48:09 crc kubenswrapper[4841]: E0130 05:48:09.433477 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:48:24 crc kubenswrapper[4841]: I0130 05:48:24.438419 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:48:24 crc kubenswrapper[4841]: E0130 05:48:24.439575 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:48:39 crc kubenswrapper[4841]: I0130 05:48:39.432314 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:48:39 crc kubenswrapper[4841]: E0130 05:48:39.433180 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:48:51 crc kubenswrapper[4841]: I0130 05:48:51.432145 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:48:51 crc kubenswrapper[4841]: E0130 05:48:51.433281 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:49:02 crc kubenswrapper[4841]: I0130 05:49:02.433365 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:49:02 crc kubenswrapper[4841]: E0130 05:49:02.434918 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:49:16 crc kubenswrapper[4841]: I0130 05:49:16.431923 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:49:17 crc kubenswrapper[4841]: I0130 05:49:17.515206 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"c7427dd6062e97635031533700819e96c8f2b18da1f6aafb8f304158f647c953"} Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.464488 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gj8r8"] Jan 30 05:49:19 crc kubenswrapper[4841]: E0130 05:49:19.465349 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerName="extract-utilities" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.465371 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerName="extract-utilities" Jan 30 05:49:19 crc kubenswrapper[4841]: E0130 05:49:19.465432 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerName="extract-content" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.465446 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerName="extract-content" Jan 30 05:49:19 crc kubenswrapper[4841]: E0130 05:49:19.465474 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" 
containerName="registry-server" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.465489 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerName="registry-server" Jan 30 05:49:19 crc kubenswrapper[4841]: E0130 05:49:19.465510 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" containerName="collect-profiles" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.465522 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" containerName="collect-profiles" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.465756 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c0732b-a645-4f8a-8bc8-47713a7c0165" containerName="registry-server" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.465795 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" containerName="collect-profiles" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.467585 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.477375 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gj8r8"] Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.642680 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flxjd\" (UniqueName: \"kubernetes.io/projected/738cee70-617f-4578-8208-09582640a2cd-kube-api-access-flxjd\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.642844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-utilities\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.642876 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-catalog-content\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.743744 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-utilities\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.743785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-catalog-content\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.743837 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flxjd\" (UniqueName: \"kubernetes.io/projected/738cee70-617f-4578-8208-09582640a2cd-kube-api-access-flxjd\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.744250 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-utilities\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.744343 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-catalog-content\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.771127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flxjd\" (UniqueName: \"kubernetes.io/projected/738cee70-617f-4578-8208-09582640a2cd-kube-api-access-flxjd\") pod \"redhat-operators-gj8r8\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:19 crc kubenswrapper[4841]: I0130 05:49:19.793276 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:20 crc kubenswrapper[4841]: I0130 05:49:20.206877 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gj8r8"] Jan 30 05:49:20 crc kubenswrapper[4841]: I0130 05:49:20.537372 4841 generic.go:334] "Generic (PLEG): container finished" podID="738cee70-617f-4578-8208-09582640a2cd" containerID="5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1" exitCode=0 Jan 30 05:49:20 crc kubenswrapper[4841]: I0130 05:49:20.537452 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj8r8" event={"ID":"738cee70-617f-4578-8208-09582640a2cd","Type":"ContainerDied","Data":"5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1"} Jan 30 05:49:20 crc kubenswrapper[4841]: I0130 05:49:20.537480 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj8r8" event={"ID":"738cee70-617f-4578-8208-09582640a2cd","Type":"ContainerStarted","Data":"b33570dc6db85783aeec1470a6bf65fe3b3da4c01a1728e34a00f12c4df16d79"} Jan 30 05:49:20 crc kubenswrapper[4841]: I0130 05:49:20.538916 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 05:49:21 crc kubenswrapper[4841]: I0130 05:49:21.554545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj8r8" event={"ID":"738cee70-617f-4578-8208-09582640a2cd","Type":"ContainerStarted","Data":"836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2"} Jan 30 05:49:22 crc kubenswrapper[4841]: I0130 05:49:22.566462 4841 generic.go:334] "Generic (PLEG): container finished" podID="738cee70-617f-4578-8208-09582640a2cd" containerID="836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2" exitCode=0 Jan 30 05:49:22 crc kubenswrapper[4841]: I0130 05:49:22.566587 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gj8r8" event={"ID":"738cee70-617f-4578-8208-09582640a2cd","Type":"ContainerDied","Data":"836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2"} Jan 30 05:49:23 crc kubenswrapper[4841]: I0130 05:49:23.578756 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj8r8" event={"ID":"738cee70-617f-4578-8208-09582640a2cd","Type":"ContainerStarted","Data":"c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127"} Jan 30 05:49:23 crc kubenswrapper[4841]: I0130 05:49:23.607067 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gj8r8" podStartSLOduration=2.160432813 podStartE2EDuration="4.607047539s" podCreationTimestamp="2026-01-30 05:49:19 +0000 UTC" firstStartedPulling="2026-01-30 05:49:20.538735243 +0000 UTC m=+2497.532207881" lastFinishedPulling="2026-01-30 05:49:22.985349939 +0000 UTC m=+2499.978822607" observedRunningTime="2026-01-30 05:49:23.598627664 +0000 UTC m=+2500.592100332" watchObservedRunningTime="2026-01-30 05:49:23.607047539 +0000 UTC m=+2500.600520187" Jan 30 05:49:29 crc kubenswrapper[4841]: I0130 05:49:29.793480 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:29 crc kubenswrapper[4841]: I0130 05:49:29.794176 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:30 crc kubenswrapper[4841]: I0130 05:49:30.871070 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gj8r8" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="registry-server" probeResult="failure" output=< Jan 30 05:49:30 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Jan 30 05:49:30 crc kubenswrapper[4841]: > Jan 30 05:49:39 crc kubenswrapper[4841]: I0130 
05:49:39.867280 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:39 crc kubenswrapper[4841]: I0130 05:49:39.946884 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:40 crc kubenswrapper[4841]: I0130 05:49:40.113652 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gj8r8"] Jan 30 05:49:41 crc kubenswrapper[4841]: I0130 05:49:41.745252 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gj8r8" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="registry-server" containerID="cri-o://c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127" gracePeriod=2 Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.246027 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.363196 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-catalog-content\") pod \"738cee70-617f-4578-8208-09582640a2cd\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.363295 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-utilities\") pod \"738cee70-617f-4578-8208-09582640a2cd\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.363349 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flxjd\" (UniqueName: 
\"kubernetes.io/projected/738cee70-617f-4578-8208-09582640a2cd-kube-api-access-flxjd\") pod \"738cee70-617f-4578-8208-09582640a2cd\" (UID: \"738cee70-617f-4578-8208-09582640a2cd\") " Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.367385 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-utilities" (OuterVolumeSpecName: "utilities") pod "738cee70-617f-4578-8208-09582640a2cd" (UID: "738cee70-617f-4578-8208-09582640a2cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.373423 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738cee70-617f-4578-8208-09582640a2cd-kube-api-access-flxjd" (OuterVolumeSpecName: "kube-api-access-flxjd") pod "738cee70-617f-4578-8208-09582640a2cd" (UID: "738cee70-617f-4578-8208-09582640a2cd"). InnerVolumeSpecName "kube-api-access-flxjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.465848 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.465902 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flxjd\" (UniqueName: \"kubernetes.io/projected/738cee70-617f-4578-8208-09582640a2cd-kube-api-access-flxjd\") on node \"crc\" DevicePath \"\"" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.516350 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "738cee70-617f-4578-8208-09582640a2cd" (UID: "738cee70-617f-4578-8208-09582640a2cd"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.566550 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738cee70-617f-4578-8208-09582640a2cd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.758806 4841 generic.go:334] "Generic (PLEG): container finished" podID="738cee70-617f-4578-8208-09582640a2cd" containerID="c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127" exitCode=0 Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.758853 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj8r8" event={"ID":"738cee70-617f-4578-8208-09582640a2cd","Type":"ContainerDied","Data":"c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127"} Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.759103 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gj8r8" event={"ID":"738cee70-617f-4578-8208-09582640a2cd","Type":"ContainerDied","Data":"b33570dc6db85783aeec1470a6bf65fe3b3da4c01a1728e34a00f12c4df16d79"} Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.759130 4841 scope.go:117] "RemoveContainer" containerID="c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.758975 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gj8r8" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.813413 4841 scope.go:117] "RemoveContainer" containerID="836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.822442 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gj8r8"] Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.833033 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gj8r8"] Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.836708 4841 scope.go:117] "RemoveContainer" containerID="5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.869685 4841 scope.go:117] "RemoveContainer" containerID="c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127" Jan 30 05:49:42 crc kubenswrapper[4841]: E0130 05:49:42.870130 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127\": container with ID starting with c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127 not found: ID does not exist" containerID="c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.870169 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127"} err="failed to get container status \"c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127\": rpc error: code = NotFound desc = could not find container \"c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127\": container with ID starting with c1be0aa2f91197914746a7be8b736fa1d47052e63d6e84236d41f0cb77d0e127 not found: ID does 
not exist" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.870193 4841 scope.go:117] "RemoveContainer" containerID="836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2" Jan 30 05:49:42 crc kubenswrapper[4841]: E0130 05:49:42.870637 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2\": container with ID starting with 836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2 not found: ID does not exist" containerID="836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.870666 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2"} err="failed to get container status \"836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2\": rpc error: code = NotFound desc = could not find container \"836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2\": container with ID starting with 836f015b1cbc3bf22ec08d75dbb608f0029a2643f80f72043f447600df8843c2 not found: ID does not exist" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.870688 4841 scope.go:117] "RemoveContainer" containerID="5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1" Jan 30 05:49:42 crc kubenswrapper[4841]: E0130 05:49:42.871070 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1\": container with ID starting with 5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1 not found: ID does not exist" containerID="5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1" Jan 30 05:49:42 crc kubenswrapper[4841]: I0130 05:49:42.871168 4841 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1"} err="failed to get container status \"5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1\": rpc error: code = NotFound desc = could not find container \"5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1\": container with ID starting with 5af73e9cbe4d7467247fca5e5671eff1d544a0d500dfbcc50f102922169947c1 not found: ID does not exist" Jan 30 05:49:44 crc kubenswrapper[4841]: I0130 05:49:44.452156 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738cee70-617f-4578-8208-09582640a2cd" path="/var/lib/kubelet/pods/738cee70-617f-4578-8208-09582640a2cd/volumes" Jan 30 05:51:40 crc kubenswrapper[4841]: I0130 05:51:40.463656 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:51:40 crc kubenswrapper[4841]: I0130 05:51:40.464329 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:52:10 crc kubenswrapper[4841]: I0130 05:52:10.464253 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:52:10 crc kubenswrapper[4841]: I0130 05:52:10.464983 4841 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:52:40 crc kubenswrapper[4841]: I0130 05:52:40.464137 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 05:52:40 crc kubenswrapper[4841]: I0130 05:52:40.464695 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 05:52:40 crc kubenswrapper[4841]: I0130 05:52:40.464745 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 05:52:40 crc kubenswrapper[4841]: I0130 05:52:40.465274 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7427dd6062e97635031533700819e96c8f2b18da1f6aafb8f304158f647c953"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 05:52:40 crc kubenswrapper[4841]: I0130 05:52:40.465332 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" 
containerID="cri-o://c7427dd6062e97635031533700819e96c8f2b18da1f6aafb8f304158f647c953" gracePeriod=600 Jan 30 05:52:41 crc kubenswrapper[4841]: I0130 05:52:41.606358 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="c7427dd6062e97635031533700819e96c8f2b18da1f6aafb8f304158f647c953" exitCode=0 Jan 30 05:52:41 crc kubenswrapper[4841]: I0130 05:52:41.606542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"c7427dd6062e97635031533700819e96c8f2b18da1f6aafb8f304158f647c953"} Jan 30 05:52:41 crc kubenswrapper[4841]: I0130 05:52:41.607346 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75"} Jan 30 05:52:41 crc kubenswrapper[4841]: I0130 05:52:41.607374 4841 scope.go:117] "RemoveContainer" containerID="90d5d5a7ec2c952668c0ebb1892a6934ee018c56b0459da151a8e55dd246ea44" Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.813164 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ktg4l"] Jan 30 05:53:16 crc kubenswrapper[4841]: E0130 05:53:16.814249 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="extract-utilities" Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.814271 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="extract-utilities" Jan 30 05:53:16 crc kubenswrapper[4841]: E0130 05:53:16.814312 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="extract-content" Jan 30 
05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.814326 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="extract-content" Jan 30 05:53:16 crc kubenswrapper[4841]: E0130 05:53:16.814345 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="registry-server" Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.814360 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="registry-server" Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.814648 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="738cee70-617f-4578-8208-09582640a2cd" containerName="registry-server" Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.816377 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.823734 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktg4l"] Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.934732 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-catalog-content\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.934827 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd9p7\" (UniqueName: \"kubernetes.io/projected/a041402d-f840-4045-88f7-77d4c69d097c-kube-api-access-pd9p7\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 
05:53:16 crc kubenswrapper[4841]: I0130 05:53:16.934932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-utilities\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.035877 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-catalog-content\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.035973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd9p7\" (UniqueName: \"kubernetes.io/projected/a041402d-f840-4045-88f7-77d4c69d097c-kube-api-access-pd9p7\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.036085 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-utilities\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.036468 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-catalog-content\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc 
kubenswrapper[4841]: I0130 05:53:17.036672 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-utilities\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.057331 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd9p7\" (UniqueName: \"kubernetes.io/projected/a041402d-f840-4045-88f7-77d4c69d097c-kube-api-access-pd9p7\") pod \"redhat-marketplace-ktg4l\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.137484 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.627420 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktg4l"] Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.963596 4841 generic.go:334] "Generic (PLEG): container finished" podID="a041402d-f840-4045-88f7-77d4c69d097c" containerID="3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5" exitCode=0 Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.963995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktg4l" event={"ID":"a041402d-f840-4045-88f7-77d4c69d097c","Type":"ContainerDied","Data":"3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5"} Jan 30 05:53:17 crc kubenswrapper[4841]: I0130 05:53:17.964042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktg4l" 
event={"ID":"a041402d-f840-4045-88f7-77d4c69d097c","Type":"ContainerStarted","Data":"b59dd6564489bc7dd6b2ddc6788122ced87b9662ce1acfc316f375566213cf99"} Jan 30 05:53:19 crc kubenswrapper[4841]: I0130 05:53:19.020647 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktg4l" event={"ID":"a041402d-f840-4045-88f7-77d4c69d097c","Type":"ContainerStarted","Data":"34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b"} Jan 30 05:53:20 crc kubenswrapper[4841]: I0130 05:53:20.034045 4841 generic.go:334] "Generic (PLEG): container finished" podID="a041402d-f840-4045-88f7-77d4c69d097c" containerID="34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b" exitCode=0 Jan 30 05:53:20 crc kubenswrapper[4841]: I0130 05:53:20.034098 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktg4l" event={"ID":"a041402d-f840-4045-88f7-77d4c69d097c","Type":"ContainerDied","Data":"34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b"} Jan 30 05:53:21 crc kubenswrapper[4841]: I0130 05:53:21.048302 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktg4l" event={"ID":"a041402d-f840-4045-88f7-77d4c69d097c","Type":"ContainerStarted","Data":"a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a"} Jan 30 05:53:27 crc kubenswrapper[4841]: I0130 05:53:27.138388 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:27 crc kubenswrapper[4841]: I0130 05:53:27.139023 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:27 crc kubenswrapper[4841]: I0130 05:53:27.213768 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:27 crc kubenswrapper[4841]: I0130 
05:53:27.237807 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ktg4l" podStartSLOduration=8.729164283 podStartE2EDuration="11.237788435s" podCreationTimestamp="2026-01-30 05:53:16 +0000 UTC" firstStartedPulling="2026-01-30 05:53:17.969250336 +0000 UTC m=+2734.962723004" lastFinishedPulling="2026-01-30 05:53:20.477874478 +0000 UTC m=+2737.471347156" observedRunningTime="2026-01-30 05:53:21.07706062 +0000 UTC m=+2738.070533308" watchObservedRunningTime="2026-01-30 05:53:27.237788435 +0000 UTC m=+2744.231261083" Jan 30 05:53:28 crc kubenswrapper[4841]: I0130 05:53:28.151285 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:28 crc kubenswrapper[4841]: I0130 05:53:28.195124 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktg4l"] Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.129491 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ktg4l" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="registry-server" containerID="cri-o://a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a" gracePeriod=2 Jan 30 05:53:30 crc kubenswrapper[4841]: E0130 05:53:30.209790 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda041402d_f840_4045_88f7_77d4c69d097c.slice/crio-a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a.scope\": RecentStats: unable to find data in memory cache]" Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.643518 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.750494 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-utilities\") pod \"a041402d-f840-4045-88f7-77d4c69d097c\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.750605 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-catalog-content\") pod \"a041402d-f840-4045-88f7-77d4c69d097c\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.750647 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd9p7\" (UniqueName: \"kubernetes.io/projected/a041402d-f840-4045-88f7-77d4c69d097c-kube-api-access-pd9p7\") pod \"a041402d-f840-4045-88f7-77d4c69d097c\" (UID: \"a041402d-f840-4045-88f7-77d4c69d097c\") " Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.752170 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-utilities" (OuterVolumeSpecName: "utilities") pod "a041402d-f840-4045-88f7-77d4c69d097c" (UID: "a041402d-f840-4045-88f7-77d4c69d097c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.755329 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a041402d-f840-4045-88f7-77d4c69d097c-kube-api-access-pd9p7" (OuterVolumeSpecName: "kube-api-access-pd9p7") pod "a041402d-f840-4045-88f7-77d4c69d097c" (UID: "a041402d-f840-4045-88f7-77d4c69d097c"). InnerVolumeSpecName "kube-api-access-pd9p7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.814568 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a041402d-f840-4045-88f7-77d4c69d097c" (UID: "a041402d-f840-4045-88f7-77d4c69d097c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.852509 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.852540 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a041402d-f840-4045-88f7-77d4c69d097c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:53:30 crc kubenswrapper[4841]: I0130 05:53:30.852551 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd9p7\" (UniqueName: \"kubernetes.io/projected/a041402d-f840-4045-88f7-77d4c69d097c-kube-api-access-pd9p7\") on node \"crc\" DevicePath \"\"" Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.140296 4841 generic.go:334] "Generic (PLEG): container finished" podID="a041402d-f840-4045-88f7-77d4c69d097c" containerID="a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a" exitCode=0 Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.140373 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktg4l" event={"ID":"a041402d-f840-4045-88f7-77d4c69d097c","Type":"ContainerDied","Data":"a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a"} Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.140430 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktg4l" Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.140492 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktg4l" event={"ID":"a041402d-f840-4045-88f7-77d4c69d097c","Type":"ContainerDied","Data":"b59dd6564489bc7dd6b2ddc6788122ced87b9662ce1acfc316f375566213cf99"} Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.140537 4841 scope.go:117] "RemoveContainer" containerID="a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a" Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.162776 4841 scope.go:117] "RemoveContainer" containerID="34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b" Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.192412 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktg4l"] Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.199955 4841 scope.go:117] "RemoveContainer" containerID="3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5" Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.200787 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktg4l"] Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.226653 4841 scope.go:117] "RemoveContainer" containerID="a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a" Jan 30 05:53:31 crc kubenswrapper[4841]: E0130 05:53:31.227103 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a\": container with ID starting with a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a not found: ID does not exist" containerID="a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a" Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.227147 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a"} err="failed to get container status \"a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a\": rpc error: code = NotFound desc = could not find container \"a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a\": container with ID starting with a23cb234935519eafa35fcc1043e4522e92641cf478feacfb4a19bb075f8f81a not found: ID does not exist"
Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.227174 4841 scope.go:117] "RemoveContainer" containerID="34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b"
Jan 30 05:53:31 crc kubenswrapper[4841]: E0130 05:53:31.227495 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b\": container with ID starting with 34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b not found: ID does not exist" containerID="34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b"
Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.227539 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b"} err="failed to get container status \"34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b\": rpc error: code = NotFound desc = could not find container \"34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b\": container with ID starting with 34139b3962272b4018c38a6eaf7360c067b4822c79bf198194c9c9e64cbe870b not found: ID does not exist"
Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.227567 4841 scope.go:117] "RemoveContainer" containerID="3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5"
Jan 30 05:53:31 crc kubenswrapper[4841]: E0130 05:53:31.227835 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5\": container with ID starting with 3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5 not found: ID does not exist" containerID="3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5"
Jan 30 05:53:31 crc kubenswrapper[4841]: I0130 05:53:31.227868 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5"} err="failed to get container status \"3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5\": rpc error: code = NotFound desc = could not find container \"3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5\": container with ID starting with 3c82d8ff9b6df357ee59ea7145df98bb2bcfac4de58422f371aeac5c41fb2be5 not found: ID does not exist"
Jan 30 05:53:32 crc kubenswrapper[4841]: I0130 05:53:32.449783 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a041402d-f840-4045-88f7-77d4c69d097c" path="/var/lib/kubelet/pods/a041402d-f840-4045-88f7-77d4c69d097c/volumes"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.804996 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5rwp8"]
Jan 30 05:53:48 crc kubenswrapper[4841]: E0130 05:53:48.806050 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="extract-content"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.806072 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="extract-content"
Jan 30 05:53:48 crc kubenswrapper[4841]: E0130 05:53:48.806097 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="registry-server"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.806109 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="registry-server"
Jan 30 05:53:48 crc kubenswrapper[4841]: E0130 05:53:48.806152 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="extract-utilities"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.806166 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="extract-utilities"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.806448 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a041402d-f840-4045-88f7-77d4c69d097c" containerName="registry-server"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.808129 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.871314 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rwp8"]
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.970234 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djf9v\" (UniqueName: \"kubernetes.io/projected/ef6c7621-c52d-4fd1-a327-f297061a9bb9-kube-api-access-djf9v\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.970903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-utilities\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:48 crc kubenswrapper[4841]: I0130 05:53:48.971067 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-catalog-content\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.072854 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-catalog-content\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.072964 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djf9v\" (UniqueName: \"kubernetes.io/projected/ef6c7621-c52d-4fd1-a327-f297061a9bb9-kube-api-access-djf9v\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.073064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-utilities\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.073447 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-catalog-content\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.073564 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-utilities\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.102862 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djf9v\" (UniqueName: \"kubernetes.io/projected/ef6c7621-c52d-4fd1-a327-f297061a9bb9-kube-api-access-djf9v\") pod \"certified-operators-5rwp8\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") " pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.178769 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:49 crc kubenswrapper[4841]: I0130 05:53:49.742120 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rwp8"]
Jan 30 05:53:50 crc kubenswrapper[4841]: I0130 05:53:50.310578 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerID="c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e" exitCode=0
Jan 30 05:53:50 crc kubenswrapper[4841]: I0130 05:53:50.310629 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwp8" event={"ID":"ef6c7621-c52d-4fd1-a327-f297061a9bb9","Type":"ContainerDied","Data":"c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e"}
Jan 30 05:53:50 crc kubenswrapper[4841]: I0130 05:53:50.310650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwp8" event={"ID":"ef6c7621-c52d-4fd1-a327-f297061a9bb9","Type":"ContainerStarted","Data":"23c8261a88133fbfbb4f51867b11b12953ee8bbd78eb74fd9f1e164b0fc47c72"}
Jan 30 05:53:51 crc kubenswrapper[4841]: I0130 05:53:51.322906 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwp8" event={"ID":"ef6c7621-c52d-4fd1-a327-f297061a9bb9","Type":"ContainerStarted","Data":"f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8"}
Jan 30 05:53:52 crc kubenswrapper[4841]: I0130 05:53:52.333265 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerID="f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8" exitCode=0
Jan 30 05:53:52 crc kubenswrapper[4841]: I0130 05:53:52.333358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwp8" event={"ID":"ef6c7621-c52d-4fd1-a327-f297061a9bb9","Type":"ContainerDied","Data":"f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8"}
Jan 30 05:53:53 crc kubenswrapper[4841]: I0130 05:53:53.347060 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwp8" event={"ID":"ef6c7621-c52d-4fd1-a327-f297061a9bb9","Type":"ContainerStarted","Data":"2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6"}
Jan 30 05:53:53 crc kubenswrapper[4841]: I0130 05:53:53.375792 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5rwp8" podStartSLOduration=2.94140781 podStartE2EDuration="5.375766163s" podCreationTimestamp="2026-01-30 05:53:48 +0000 UTC" firstStartedPulling="2026-01-30 05:53:50.313093275 +0000 UTC m=+2767.306565913" lastFinishedPulling="2026-01-30 05:53:52.747451588 +0000 UTC m=+2769.740924266" observedRunningTime="2026-01-30 05:53:53.3691635 +0000 UTC m=+2770.362636168" watchObservedRunningTime="2026-01-30 05:53:53.375766163 +0000 UTC m=+2770.369238841"
Jan 30 05:53:59 crc kubenswrapper[4841]: I0130 05:53:59.178828 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:59 crc kubenswrapper[4841]: I0130 05:53:59.179223 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:59 crc kubenswrapper[4841]: I0130 05:53:59.255554 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:59 crc kubenswrapper[4841]: I0130 05:53:59.485039 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:53:59 crc kubenswrapper[4841]: I0130 05:53:59.590813 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rwp8"]
Jan 30 05:54:01 crc kubenswrapper[4841]: I0130 05:54:01.433937 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5rwp8" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="registry-server" containerID="cri-o://2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6" gracePeriod=2
Jan 30 05:54:01 crc kubenswrapper[4841]: I0130 05:54:01.920624 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.097084 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djf9v\" (UniqueName: \"kubernetes.io/projected/ef6c7621-c52d-4fd1-a327-f297061a9bb9-kube-api-access-djf9v\") pod \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") "
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.097140 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-utilities\") pod \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") "
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.097277 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-catalog-content\") pod \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\" (UID: \"ef6c7621-c52d-4fd1-a327-f297061a9bb9\") "
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.098211 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-utilities" (OuterVolumeSpecName: "utilities") pod "ef6c7621-c52d-4fd1-a327-f297061a9bb9" (UID: "ef6c7621-c52d-4fd1-a327-f297061a9bb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.103862 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6c7621-c52d-4fd1-a327-f297061a9bb9-kube-api-access-djf9v" (OuterVolumeSpecName: "kube-api-access-djf9v") pod "ef6c7621-c52d-4fd1-a327-f297061a9bb9" (UID: "ef6c7621-c52d-4fd1-a327-f297061a9bb9"). InnerVolumeSpecName "kube-api-access-djf9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.156422 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef6c7621-c52d-4fd1-a327-f297061a9bb9" (UID: "ef6c7621-c52d-4fd1-a327-f297061a9bb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.199249 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djf9v\" (UniqueName: \"kubernetes.io/projected/ef6c7621-c52d-4fd1-a327-f297061a9bb9-kube-api-access-djf9v\") on node \"crc\" DevicePath \"\""
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.199291 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.199305 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef6c7621-c52d-4fd1-a327-f297061a9bb9-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.457428 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerID="2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6" exitCode=0
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.457714 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rwp8"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.457786 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwp8" event={"ID":"ef6c7621-c52d-4fd1-a327-f297061a9bb9","Type":"ContainerDied","Data":"2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6"}
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.459773 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwp8" event={"ID":"ef6c7621-c52d-4fd1-a327-f297061a9bb9","Type":"ContainerDied","Data":"23c8261a88133fbfbb4f51867b11b12953ee8bbd78eb74fd9f1e164b0fc47c72"}
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.459829 4841 scope.go:117] "RemoveContainer" containerID="2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.491451 4841 scope.go:117] "RemoveContainer" containerID="f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.513017 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rwp8"]
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.523366 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5rwp8"]
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.526310 4841 scope.go:117] "RemoveContainer" containerID="c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.557154 4841 scope.go:117] "RemoveContainer" containerID="2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6"
Jan 30 05:54:02 crc kubenswrapper[4841]: E0130 05:54:02.557692 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6\": container with ID starting with 2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6 not found: ID does not exist" containerID="2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.557737 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6"} err="failed to get container status \"2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6\": rpc error: code = NotFound desc = could not find container \"2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6\": container with ID starting with 2f26afa37e5dd49cd463a0c8c90cdf5ee6d99e2d37deb3fa564cdf724c465cf6 not found: ID does not exist"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.557765 4841 scope.go:117] "RemoveContainer" containerID="f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8"
Jan 30 05:54:02 crc kubenswrapper[4841]: E0130 05:54:02.558170 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8\": container with ID starting with f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8 not found: ID does not exist" containerID="f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.558386 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8"} err="failed to get container status \"f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8\": rpc error: code = NotFound desc = could not find container \"f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8\": container with ID starting with f695de8657e4ec3a32614408178bbc8eb3cf10bf5f4524a41399c8c7757decc8 not found: ID does not exist"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.558617 4841 scope.go:117] "RemoveContainer" containerID="c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e"
Jan 30 05:54:02 crc kubenswrapper[4841]: E0130 05:54:02.559078 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e\": container with ID starting with c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e not found: ID does not exist" containerID="c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e"
Jan 30 05:54:02 crc kubenswrapper[4841]: I0130 05:54:02.559115 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e"} err="failed to get container status \"c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e\": rpc error: code = NotFound desc = could not find container \"c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e\": container with ID starting with c77a9fe23cb7fd7d9b5251c4971125ea37effe710a159e168e17c847fbd6c38e not found: ID does not exist"
Jan 30 05:54:04 crc kubenswrapper[4841]: I0130 05:54:04.448073 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" path="/var/lib/kubelet/pods/ef6c7621-c52d-4fd1-a327-f297061a9bb9/volumes"
Jan 30 05:54:40 crc kubenswrapper[4841]: I0130 05:54:40.463428 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:54:40 crc kubenswrapper[4841]: I0130 05:54:40.464126 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:55:10 crc kubenswrapper[4841]: I0130 05:55:10.464146 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:55:10 crc kubenswrapper[4841]: I0130 05:55:10.464802 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.314047 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zn9bf"]
Jan 30 05:55:27 crc kubenswrapper[4841]: E0130 05:55:27.314978 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="extract-content"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.314992 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="extract-content"
Jan 30 05:55:27 crc kubenswrapper[4841]: E0130 05:55:27.315005 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="extract-utilities"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.315012 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="extract-utilities"
Jan 30 05:55:27 crc kubenswrapper[4841]: E0130 05:55:27.315046 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="registry-server"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.315055 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="registry-server"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.315211 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6c7621-c52d-4fd1-a327-f297061a9bb9" containerName="registry-server"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.316586 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.339156 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zn9bf"]
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.425074 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pddx7\" (UniqueName: \"kubernetes.io/projected/80741778-3aed-469c-b234-af300e01cab0-kube-api-access-pddx7\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.425299 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-utilities\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.425352 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-catalog-content\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.527051 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pddx7\" (UniqueName: \"kubernetes.io/projected/80741778-3aed-469c-b234-af300e01cab0-kube-api-access-pddx7\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.527135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-utilities\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.527153 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-catalog-content\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.527663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-catalog-content\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.527789 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-utilities\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.557454 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pddx7\" (UniqueName: \"kubernetes.io/projected/80741778-3aed-469c-b234-af300e01cab0-kube-api-access-pddx7\") pod \"community-operators-zn9bf\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") " pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:27 crc kubenswrapper[4841]: I0130 05:55:27.687726 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:28 crc kubenswrapper[4841]: I0130 05:55:28.173247 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zn9bf"]
Jan 30 05:55:28 crc kubenswrapper[4841]: I0130 05:55:28.306264 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zn9bf" event={"ID":"80741778-3aed-469c-b234-af300e01cab0","Type":"ContainerStarted","Data":"d49989aa984bbc6e9e48a58f132af8277fc9edcde317177b8e6e8032e3ddd1dd"}
Jan 30 05:55:29 crc kubenswrapper[4841]: I0130 05:55:29.318538 4841 generic.go:334] "Generic (PLEG): container finished" podID="80741778-3aed-469c-b234-af300e01cab0" containerID="8652dce5dbb76cb6c6158b3c90f5797fa6ded670b764a95bfc11e8bf5f540dd6" exitCode=0
Jan 30 05:55:29 crc kubenswrapper[4841]: I0130 05:55:29.318625 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zn9bf" event={"ID":"80741778-3aed-469c-b234-af300e01cab0","Type":"ContainerDied","Data":"8652dce5dbb76cb6c6158b3c90f5797fa6ded670b764a95bfc11e8bf5f540dd6"}
Jan 30 05:55:29 crc kubenswrapper[4841]: I0130 05:55:29.323303 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 05:55:30 crc kubenswrapper[4841]: I0130 05:55:30.332533 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zn9bf" event={"ID":"80741778-3aed-469c-b234-af300e01cab0","Type":"ContainerStarted","Data":"5ab57e25394758a9ab209f4fc7f5097e847de29d1b77264ee5b31b81a6bd4087"}
Jan 30 05:55:31 crc kubenswrapper[4841]: I0130 05:55:31.346675 4841 generic.go:334] "Generic (PLEG): container finished" podID="80741778-3aed-469c-b234-af300e01cab0" containerID="5ab57e25394758a9ab209f4fc7f5097e847de29d1b77264ee5b31b81a6bd4087" exitCode=0
Jan 30 05:55:31 crc kubenswrapper[4841]: I0130 05:55:31.346805 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zn9bf" event={"ID":"80741778-3aed-469c-b234-af300e01cab0","Type":"ContainerDied","Data":"5ab57e25394758a9ab209f4fc7f5097e847de29d1b77264ee5b31b81a6bd4087"}
Jan 30 05:55:32 crc kubenswrapper[4841]: I0130 05:55:32.362872 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zn9bf" event={"ID":"80741778-3aed-469c-b234-af300e01cab0","Type":"ContainerStarted","Data":"ea192bb3c87388c4ef51348a2cd90328899dab69e2dc91a32539fcb2f2b713a2"}
Jan 30 05:55:32 crc kubenswrapper[4841]: I0130 05:55:32.403808 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zn9bf" podStartSLOduration=2.808628472 podStartE2EDuration="5.403782353s" podCreationTimestamp="2026-01-30 05:55:27 +0000 UTC" firstStartedPulling="2026-01-30 05:55:29.322982769 +0000 UTC m=+2866.316455417" lastFinishedPulling="2026-01-30 05:55:31.91813663 +0000 UTC m=+2868.911609298" observedRunningTime="2026-01-30 05:55:32.391707614 +0000 UTC m=+2869.385180292" watchObservedRunningTime="2026-01-30 05:55:32.403782353 +0000 UTC m=+2869.397255031"
Jan 30 05:55:37 crc kubenswrapper[4841]: I0130 05:55:37.688005 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:37 crc kubenswrapper[4841]: I0130 05:55:37.688708 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:37 crc kubenswrapper[4841]: I0130 05:55:37.764023 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:38 crc kubenswrapper[4841]: I0130 05:55:38.481921 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:38 crc kubenswrapper[4841]: I0130 05:55:38.549385 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zn9bf"]
Jan 30 05:55:40 crc kubenswrapper[4841]: I0130 05:55:40.432842 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zn9bf" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="registry-server" containerID="cri-o://ea192bb3c87388c4ef51348a2cd90328899dab69e2dc91a32539fcb2f2b713a2" gracePeriod=2
Jan 30 05:55:40 crc kubenswrapper[4841]: I0130 05:55:40.463954 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 05:55:40 crc kubenswrapper[4841]: I0130 05:55:40.464050 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 05:55:40 crc kubenswrapper[4841]: I0130 05:55:40.464113 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 05:55:40 crc kubenswrapper[4841]: I0130 05:55:40.465127 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 05:55:40 crc kubenswrapper[4841]: I0130 05:55:40.465249 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" gracePeriod=600
Jan 30 05:55:41 crc kubenswrapper[4841]: E0130 05:55:41.095847 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.450845 4841 generic.go:334] "Generic (PLEG): container finished" podID="80741778-3aed-469c-b234-af300e01cab0" containerID="ea192bb3c87388c4ef51348a2cd90328899dab69e2dc91a32539fcb2f2b713a2" exitCode=0
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.450916 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zn9bf" event={"ID":"80741778-3aed-469c-b234-af300e01cab0","Type":"ContainerDied","Data":"ea192bb3c87388c4ef51348a2cd90328899dab69e2dc91a32539fcb2f2b713a2"}
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.450945 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zn9bf" event={"ID":"80741778-3aed-469c-b234-af300e01cab0","Type":"ContainerDied","Data":"d49989aa984bbc6e9e48a58f132af8277fc9edcde317177b8e6e8032e3ddd1dd"}
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.450959 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d49989aa984bbc6e9e48a58f132af8277fc9edcde317177b8e6e8032e3ddd1dd"
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.454497 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" exitCode=0
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.454548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75"}
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.454575 4841 scope.go:117] "RemoveContainer" containerID="c7427dd6062e97635031533700819e96c8f2b18da1f6aafb8f304158f647c953"
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.455426 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75"
Jan 30 05:55:41 crc kubenswrapper[4841]: E0130 05:55:41.455727 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.456304 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zn9bf"
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.646215 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-utilities\") pod \"80741778-3aed-469c-b234-af300e01cab0\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") "
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.646587 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddx7\" (UniqueName: \"kubernetes.io/projected/80741778-3aed-469c-b234-af300e01cab0-kube-api-access-pddx7\") pod \"80741778-3aed-469c-b234-af300e01cab0\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") "
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.646702 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-catalog-content\") pod \"80741778-3aed-469c-b234-af300e01cab0\" (UID: \"80741778-3aed-469c-b234-af300e01cab0\") "
Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.647919 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-utilities" (OuterVolumeSpecName: "utilities") pod "80741778-3aed-469c-b234-af300e01cab0" (UID: "80741778-3aed-469c-b234-af300e01cab0"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.652906 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80741778-3aed-469c-b234-af300e01cab0-kube-api-access-pddx7" (OuterVolumeSpecName: "kube-api-access-pddx7") pod "80741778-3aed-469c-b234-af300e01cab0" (UID: "80741778-3aed-469c-b234-af300e01cab0"). InnerVolumeSpecName "kube-api-access-pddx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.710166 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80741778-3aed-469c-b234-af300e01cab0" (UID: "80741778-3aed-469c-b234-af300e01cab0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.748757 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.748803 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80741778-3aed-469c-b234-af300e01cab0-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 05:55:41 crc kubenswrapper[4841]: I0130 05:55:41.748816 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pddx7\" (UniqueName: \"kubernetes.io/projected/80741778-3aed-469c-b234-af300e01cab0-kube-api-access-pddx7\") on node \"crc\" DevicePath \"\"" Jan 30 05:55:42 crc kubenswrapper[4841]: I0130 05:55:42.472054 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zn9bf" Jan 30 05:55:42 crc kubenswrapper[4841]: I0130 05:55:42.518184 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zn9bf"] Jan 30 05:55:42 crc kubenswrapper[4841]: I0130 05:55:42.531353 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zn9bf"] Jan 30 05:55:44 crc kubenswrapper[4841]: I0130 05:55:44.447901 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80741778-3aed-469c-b234-af300e01cab0" path="/var/lib/kubelet/pods/80741778-3aed-469c-b234-af300e01cab0/volumes" Jan 30 05:55:55 crc kubenswrapper[4841]: I0130 05:55:55.432604 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:55:55 crc kubenswrapper[4841]: E0130 05:55:55.433474 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:56:10 crc kubenswrapper[4841]: I0130 05:56:10.432918 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:56:10 crc kubenswrapper[4841]: E0130 05:56:10.434112 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" 
podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:56:22 crc kubenswrapper[4841]: I0130 05:56:22.431859 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:56:22 crc kubenswrapper[4841]: E0130 05:56:22.432828 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:56:33 crc kubenswrapper[4841]: I0130 05:56:33.431844 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:56:33 crc kubenswrapper[4841]: E0130 05:56:33.432849 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:56:44 crc kubenswrapper[4841]: I0130 05:56:44.436953 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:56:44 crc kubenswrapper[4841]: E0130 05:56:44.437842 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:56:59 crc kubenswrapper[4841]: I0130 05:56:59.431794 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:56:59 crc kubenswrapper[4841]: E0130 05:56:59.432896 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:57:10 crc kubenswrapper[4841]: I0130 05:57:10.432740 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:57:10 crc kubenswrapper[4841]: E0130 05:57:10.433998 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:57:22 crc kubenswrapper[4841]: I0130 05:57:22.432625 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:57:22 crc kubenswrapper[4841]: E0130 05:57:22.433413 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:57:34 crc kubenswrapper[4841]: I0130 05:57:34.439228 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:57:34 crc kubenswrapper[4841]: E0130 05:57:34.440633 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:57:48 crc kubenswrapper[4841]: I0130 05:57:48.432482 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:57:48 crc kubenswrapper[4841]: E0130 05:57:48.433183 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:57:59 crc kubenswrapper[4841]: I0130 05:57:59.432366 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:57:59 crc kubenswrapper[4841]: E0130 05:57:59.433510 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:58:13 crc kubenswrapper[4841]: I0130 05:58:13.432805 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:58:13 crc kubenswrapper[4841]: E0130 05:58:13.434026 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:58:25 crc kubenswrapper[4841]: I0130 05:58:25.433010 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:58:25 crc kubenswrapper[4841]: E0130 05:58:25.433956 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:58:39 crc kubenswrapper[4841]: I0130 05:58:39.432237 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:58:39 crc kubenswrapper[4841]: E0130 05:58:39.432935 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:58:53 crc kubenswrapper[4841]: I0130 05:58:53.431769 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:58:53 crc kubenswrapper[4841]: E0130 05:58:53.432570 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:59:05 crc kubenswrapper[4841]: I0130 05:59:05.432242 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:59:05 crc kubenswrapper[4841]: E0130 05:59:05.433165 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:59:19 crc kubenswrapper[4841]: I0130 05:59:19.433062 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:59:19 crc kubenswrapper[4841]: E0130 05:59:19.434215 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:59:30 crc kubenswrapper[4841]: I0130 05:59:30.431761 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:59:30 crc kubenswrapper[4841]: E0130 05:59:30.432706 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:59:41 crc kubenswrapper[4841]: I0130 05:59:41.432003 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:59:41 crc kubenswrapper[4841]: E0130 05:59:41.432970 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:59:52 crc kubenswrapper[4841]: I0130 05:59:52.432191 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 05:59:52 crc kubenswrapper[4841]: E0130 05:59:52.433082 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 05:59:59 crc kubenswrapper[4841]: I0130 05:59:59.977928 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mzbcj"] Jan 30 05:59:59 crc kubenswrapper[4841]: E0130 05:59:59.980586 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="registry-server" Jan 30 05:59:59 crc kubenswrapper[4841]: I0130 05:59:59.980655 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="registry-server" Jan 30 05:59:59 crc kubenswrapper[4841]: E0130 05:59:59.980686 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="extract-utilities" Jan 30 05:59:59 crc kubenswrapper[4841]: I0130 05:59:59.980701 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="extract-utilities" Jan 30 05:59:59 crc kubenswrapper[4841]: E0130 05:59:59.980781 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="extract-content" Jan 30 05:59:59 crc kubenswrapper[4841]: I0130 05:59:59.980803 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="extract-content" Jan 30 05:59:59 crc kubenswrapper[4841]: I0130 05:59:59.981475 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="80741778-3aed-469c-b234-af300e01cab0" containerName="registry-server" Jan 30 05:59:59 crc kubenswrapper[4841]: I0130 05:59:59.987540 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-mzbcj"] Jan 30 05:59:59 crc kubenswrapper[4841]: I0130 05:59:59.987709 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.137637 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-utilities\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.137838 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mntsx\" (UniqueName: \"kubernetes.io/projected/d905cf3a-0752-43e2-bf4f-e063c364c437-kube-api-access-mntsx\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.137894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-catalog-content\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.164235 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm"] Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.165639 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.170095 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.175734 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.180376 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm"] Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.239193 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mntsx\" (UniqueName: \"kubernetes.io/projected/d905cf3a-0752-43e2-bf4f-e063c364c437-kube-api-access-mntsx\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.239264 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-catalog-content\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.239381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-utilities\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.240015 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-catalog-content\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.240035 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-utilities\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.262482 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mntsx\" (UniqueName: \"kubernetes.io/projected/d905cf3a-0752-43e2-bf4f-e063c364c437-kube-api-access-mntsx\") pod \"redhat-operators-mzbcj\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.324599 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.347119 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f21364d-f9d0-4b5c-897e-44d183bbd441-secret-volume\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.347196 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f21364d-f9d0-4b5c-897e-44d183bbd441-config-volume\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.347320 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5f6\" (UniqueName: \"kubernetes.io/projected/5f21364d-f9d0-4b5c-897e-44d183bbd441-kube-api-access-mm5f6\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.448691 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f21364d-f9d0-4b5c-897e-44d183bbd441-secret-volume\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.449038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5f21364d-f9d0-4b5c-897e-44d183bbd441-config-volume\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.449089 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5f6\" (UniqueName: \"kubernetes.io/projected/5f21364d-f9d0-4b5c-897e-44d183bbd441-kube-api-access-mm5f6\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.451349 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f21364d-f9d0-4b5c-897e-44d183bbd441-config-volume\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.453086 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f21364d-f9d0-4b5c-897e-44d183bbd441-secret-volume\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.478595 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5f6\" (UniqueName: \"kubernetes.io/projected/5f21364d-f9d0-4b5c-897e-44d183bbd441-kube-api-access-mm5f6\") pod \"collect-profiles-29495880-hvjbm\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.489422 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.717426 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm"] Jan 30 06:00:00 crc kubenswrapper[4841]: W0130 06:00:00.761368 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd905cf3a_0752_43e2_bf4f_e063c364c437.slice/crio-9629fb558a10788bb539e3250c7227ebe258066c1a9cc844966188e26bb8fcaf WatchSource:0}: Error finding container 9629fb558a10788bb539e3250c7227ebe258066c1a9cc844966188e26bb8fcaf: Status 404 returned error can't find the container with id 9629fb558a10788bb539e3250c7227ebe258066c1a9cc844966188e26bb8fcaf Jan 30 06:00:00 crc kubenswrapper[4841]: I0130 06:00:00.762512 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mzbcj"] Jan 30 06:00:01 crc kubenswrapper[4841]: I0130 06:00:01.238127 4841 generic.go:334] "Generic (PLEG): container finished" podID="5f21364d-f9d0-4b5c-897e-44d183bbd441" containerID="2181758f83931ab95d5f734fd20cb598f51d12c4708e8c7e675b734e95a1e4bf" exitCode=0 Jan 30 06:00:01 crc kubenswrapper[4841]: I0130 06:00:01.239002 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" event={"ID":"5f21364d-f9d0-4b5c-897e-44d183bbd441","Type":"ContainerDied","Data":"2181758f83931ab95d5f734fd20cb598f51d12c4708e8c7e675b734e95a1e4bf"} Jan 30 06:00:01 crc kubenswrapper[4841]: I0130 06:00:01.239027 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" event={"ID":"5f21364d-f9d0-4b5c-897e-44d183bbd441","Type":"ContainerStarted","Data":"79d132d7f898d1fecdef9f99e8fd5e44081bc2bda5ae13fcce762e4e123ad1dc"} Jan 30 
06:00:01 crc kubenswrapper[4841]: I0130 06:00:01.240570 4841 generic.go:334] "Generic (PLEG): container finished" podID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerID="d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5" exitCode=0 Jan 30 06:00:01 crc kubenswrapper[4841]: I0130 06:00:01.240594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzbcj" event={"ID":"d905cf3a-0752-43e2-bf4f-e063c364c437","Type":"ContainerDied","Data":"d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5"} Jan 30 06:00:01 crc kubenswrapper[4841]: I0130 06:00:01.240607 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzbcj" event={"ID":"d905cf3a-0752-43e2-bf4f-e063c364c437","Type":"ContainerStarted","Data":"9629fb558a10788bb539e3250c7227ebe258066c1a9cc844966188e26bb8fcaf"} Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.665284 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.786842 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm5f6\" (UniqueName: \"kubernetes.io/projected/5f21364d-f9d0-4b5c-897e-44d183bbd441-kube-api-access-mm5f6\") pod \"5f21364d-f9d0-4b5c-897e-44d183bbd441\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.786919 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f21364d-f9d0-4b5c-897e-44d183bbd441-config-volume\") pod \"5f21364d-f9d0-4b5c-897e-44d183bbd441\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.786986 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f21364d-f9d0-4b5c-897e-44d183bbd441-secret-volume\") pod \"5f21364d-f9d0-4b5c-897e-44d183bbd441\" (UID: \"5f21364d-f9d0-4b5c-897e-44d183bbd441\") " Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.787785 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f21364d-f9d0-4b5c-897e-44d183bbd441-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f21364d-f9d0-4b5c-897e-44d183bbd441" (UID: "5f21364d-f9d0-4b5c-897e-44d183bbd441"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.793868 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f21364d-f9d0-4b5c-897e-44d183bbd441-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f21364d-f9d0-4b5c-897e-44d183bbd441" (UID: "5f21364d-f9d0-4b5c-897e-44d183bbd441"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.795266 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f21364d-f9d0-4b5c-897e-44d183bbd441-kube-api-access-mm5f6" (OuterVolumeSpecName: "kube-api-access-mm5f6") pod "5f21364d-f9d0-4b5c-897e-44d183bbd441" (UID: "5f21364d-f9d0-4b5c-897e-44d183bbd441"). InnerVolumeSpecName "kube-api-access-mm5f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.888951 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm5f6\" (UniqueName: \"kubernetes.io/projected/5f21364d-f9d0-4b5c-897e-44d183bbd441-kube-api-access-mm5f6\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.889010 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f21364d-f9d0-4b5c-897e-44d183bbd441-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:02 crc kubenswrapper[4841]: I0130 06:00:02.889044 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f21364d-f9d0-4b5c-897e-44d183bbd441-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:03 crc kubenswrapper[4841]: I0130 06:00:03.261902 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" event={"ID":"5f21364d-f9d0-4b5c-897e-44d183bbd441","Type":"ContainerDied","Data":"79d132d7f898d1fecdef9f99e8fd5e44081bc2bda5ae13fcce762e4e123ad1dc"} Jan 30 06:00:03 crc kubenswrapper[4841]: I0130 06:00:03.262283 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79d132d7f898d1fecdef9f99e8fd5e44081bc2bda5ae13fcce762e4e123ad1dc" Jan 30 06:00:03 crc kubenswrapper[4841]: I0130 06:00:03.261980 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm" Jan 30 06:00:03 crc kubenswrapper[4841]: I0130 06:00:03.432708 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 06:00:03 crc kubenswrapper[4841]: E0130 06:00:03.433082 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:00:03 crc kubenswrapper[4841]: I0130 06:00:03.762085 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t"] Jan 30 06:00:03 crc kubenswrapper[4841]: I0130 06:00:03.770478 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495835-dqj6t"] Jan 30 06:00:04 crc kubenswrapper[4841]: I0130 06:00:04.447547 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91b1aae-b372-4348-b07b-0afb79ecfc61" path="/var/lib/kubelet/pods/f91b1aae-b372-4348-b07b-0afb79ecfc61/volumes" Jan 30 06:00:06 crc kubenswrapper[4841]: I0130 06:00:06.318855 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzbcj" event={"ID":"d905cf3a-0752-43e2-bf4f-e063c364c437","Type":"ContainerStarted","Data":"4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1"} Jan 30 06:00:07 crc kubenswrapper[4841]: I0130 06:00:07.333258 4841 generic.go:334] "Generic (PLEG): container finished" podID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerID="4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1" exitCode=0 Jan 
30 06:00:07 crc kubenswrapper[4841]: I0130 06:00:07.333323 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzbcj" event={"ID":"d905cf3a-0752-43e2-bf4f-e063c364c437","Type":"ContainerDied","Data":"4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1"} Jan 30 06:00:08 crc kubenswrapper[4841]: I0130 06:00:08.348295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzbcj" event={"ID":"d905cf3a-0752-43e2-bf4f-e063c364c437","Type":"ContainerStarted","Data":"9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5"} Jan 30 06:00:08 crc kubenswrapper[4841]: I0130 06:00:08.387539 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mzbcj" podStartSLOduration=2.8647661700000002 podStartE2EDuration="9.387510666s" podCreationTimestamp="2026-01-30 05:59:59 +0000 UTC" firstStartedPulling="2026-01-30 06:00:01.242101092 +0000 UTC m=+3138.235573730" lastFinishedPulling="2026-01-30 06:00:07.764845558 +0000 UTC m=+3144.758318226" observedRunningTime="2026-01-30 06:00:08.381873157 +0000 UTC m=+3145.375345805" watchObservedRunningTime="2026-01-30 06:00:08.387510666 +0000 UTC m=+3145.380983354" Jan 30 06:00:10 crc kubenswrapper[4841]: I0130 06:00:10.326355 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:10 crc kubenswrapper[4841]: I0130 06:00:10.326699 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:11 crc kubenswrapper[4841]: I0130 06:00:11.390912 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mzbcj" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="registry-server" probeResult="failure" output=< Jan 30 06:00:11 crc kubenswrapper[4841]: timeout: failed to connect 
service ":50051" within 1s Jan 30 06:00:11 crc kubenswrapper[4841]: > Jan 30 06:00:17 crc kubenswrapper[4841]: I0130 06:00:17.432958 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 06:00:17 crc kubenswrapper[4841]: E0130 06:00:17.433876 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:00:20 crc kubenswrapper[4841]: I0130 06:00:20.402015 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:20 crc kubenswrapper[4841]: I0130 06:00:20.482822 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:20 crc kubenswrapper[4841]: I0130 06:00:20.651422 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mzbcj"] Jan 30 06:00:21 crc kubenswrapper[4841]: I0130 06:00:21.495338 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mzbcj" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="registry-server" containerID="cri-o://9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5" gracePeriod=2 Jan 30 06:00:21 crc kubenswrapper[4841]: I0130 06:00:21.972068 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.108119 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-utilities\") pod \"d905cf3a-0752-43e2-bf4f-e063c364c437\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.108244 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mntsx\" (UniqueName: \"kubernetes.io/projected/d905cf3a-0752-43e2-bf4f-e063c364c437-kube-api-access-mntsx\") pod \"d905cf3a-0752-43e2-bf4f-e063c364c437\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.108335 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-catalog-content\") pod \"d905cf3a-0752-43e2-bf4f-e063c364c437\" (UID: \"d905cf3a-0752-43e2-bf4f-e063c364c437\") " Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.109897 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-utilities" (OuterVolumeSpecName: "utilities") pod "d905cf3a-0752-43e2-bf4f-e063c364c437" (UID: "d905cf3a-0752-43e2-bf4f-e063c364c437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.117900 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d905cf3a-0752-43e2-bf4f-e063c364c437-kube-api-access-mntsx" (OuterVolumeSpecName: "kube-api-access-mntsx") pod "d905cf3a-0752-43e2-bf4f-e063c364c437" (UID: "d905cf3a-0752-43e2-bf4f-e063c364c437"). InnerVolumeSpecName "kube-api-access-mntsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.210811 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.210870 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mntsx\" (UniqueName: \"kubernetes.io/projected/d905cf3a-0752-43e2-bf4f-e063c364c437-kube-api-access-mntsx\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.257855 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d905cf3a-0752-43e2-bf4f-e063c364c437" (UID: "d905cf3a-0752-43e2-bf4f-e063c364c437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.312748 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d905cf3a-0752-43e2-bf4f-e063c364c437-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.507261 4841 generic.go:334] "Generic (PLEG): container finished" podID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerID="9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5" exitCode=0 Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.507319 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzbcj" event={"ID":"d905cf3a-0752-43e2-bf4f-e063c364c437","Type":"ContainerDied","Data":"9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5"} Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.507348 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-mzbcj" event={"ID":"d905cf3a-0752-43e2-bf4f-e063c364c437","Type":"ContainerDied","Data":"9629fb558a10788bb539e3250c7227ebe258066c1a9cc844966188e26bb8fcaf"} Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.507366 4841 scope.go:117] "RemoveContainer" containerID="9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.507431 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzbcj" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.537341 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mzbcj"] Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.548542 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mzbcj"] Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.550941 4841 scope.go:117] "RemoveContainer" containerID="4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.577370 4841 scope.go:117] "RemoveContainer" containerID="d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.604061 4841 scope.go:117] "RemoveContainer" containerID="9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5" Jan 30 06:00:22 crc kubenswrapper[4841]: E0130 06:00:22.604590 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5\": container with ID starting with 9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5 not found: ID does not exist" containerID="9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.604657 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5"} err="failed to get container status \"9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5\": rpc error: code = NotFound desc = could not find container \"9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5\": container with ID starting with 9cd58d5ce11eb095aae77dfde281637828fcfa950cce1eacdec40ca8963647f5 not found: ID does not exist" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.604705 4841 scope.go:117] "RemoveContainer" containerID="4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1" Jan 30 06:00:22 crc kubenswrapper[4841]: E0130 06:00:22.605567 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1\": container with ID starting with 4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1 not found: ID does not exist" containerID="4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.605630 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1"} err="failed to get container status \"4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1\": rpc error: code = NotFound desc = could not find container \"4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1\": container with ID starting with 4bd224e64d96ba80e617218c3664412940351a2236242c88cd8c889a6e4b30d1 not found: ID does not exist" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.605669 4841 scope.go:117] "RemoveContainer" containerID="d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5" Jan 30 06:00:22 crc kubenswrapper[4841]: E0130 
06:00:22.606098 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5\": container with ID starting with d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5 not found: ID does not exist" containerID="d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5" Jan 30 06:00:22 crc kubenswrapper[4841]: I0130 06:00:22.606157 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5"} err="failed to get container status \"d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5\": rpc error: code = NotFound desc = could not find container \"d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5\": container with ID starting with d431a4263a9a5a3fb8e9c5186f2909a62b329d271d0d06b7e749b4ce927bc1d5 not found: ID does not exist" Jan 30 06:00:24 crc kubenswrapper[4841]: I0130 06:00:24.450804 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" path="/var/lib/kubelet/pods/d905cf3a-0752-43e2-bf4f-e063c364c437/volumes" Jan 30 06:00:28 crc kubenswrapper[4841]: I0130 06:00:28.432709 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 06:00:28 crc kubenswrapper[4841]: E0130 06:00:28.433380 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:00:42 crc kubenswrapper[4841]: I0130 06:00:42.432864 
4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 06:00:43 crc kubenswrapper[4841]: I0130 06:00:43.716563 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"8b27b440560b98b1aaa629d9247b2cb76a1ed1307d7294ce65ab494bca036655"} Jan 30 06:00:50 crc kubenswrapper[4841]: I0130 06:00:50.006003 4841 scope.go:117] "RemoveContainer" containerID="a9dfc72753b604180f840fb28e8d834420846f09f8cc3ecf5371ce85c5fdf6d8" Jan 30 06:01:50 crc kubenswrapper[4841]: I0130 06:01:50.093701 4841 scope.go:117] "RemoveContainer" containerID="ea192bb3c87388c4ef51348a2cd90328899dab69e2dc91a32539fcb2f2b713a2" Jan 30 06:01:50 crc kubenswrapper[4841]: I0130 06:01:50.115876 4841 scope.go:117] "RemoveContainer" containerID="8652dce5dbb76cb6c6158b3c90f5797fa6ded670b764a95bfc11e8bf5f540dd6" Jan 30 06:01:50 crc kubenswrapper[4841]: I0130 06:01:50.133745 4841 scope.go:117] "RemoveContainer" containerID="5ab57e25394758a9ab209f4fc7f5097e847de29d1b77264ee5b31b81a6bd4087" Jan 30 06:03:10 crc kubenswrapper[4841]: I0130 06:03:10.463345 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:03:10 crc kubenswrapper[4841]: I0130 06:03:10.465688 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:03:40 crc kubenswrapper[4841]: I0130 06:03:40.464083 4841 
patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:03:40 crc kubenswrapper[4841]: I0130 06:03:40.464829 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.574751 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f528q"] Jan 30 06:04:03 crc kubenswrapper[4841]: E0130 06:04:03.575642 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="registry-server" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.575659 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="registry-server" Jan 30 06:04:03 crc kubenswrapper[4841]: E0130 06:04:03.575688 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="extract-utilities" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.575696 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="extract-utilities" Jan 30 06:04:03 crc kubenswrapper[4841]: E0130 06:04:03.575735 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f21364d-f9d0-4b5c-897e-44d183bbd441" containerName="collect-profiles" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.575745 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5f21364d-f9d0-4b5c-897e-44d183bbd441" containerName="collect-profiles" Jan 30 06:04:03 crc kubenswrapper[4841]: E0130 06:04:03.575757 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="extract-content" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.575766 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="extract-content" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.575920 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f21364d-f9d0-4b5c-897e-44d183bbd441" containerName="collect-profiles" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.575936 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d905cf3a-0752-43e2-bf4f-e063c364c437" containerName="registry-server" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.577193 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.609433 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f528q"] Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.774573 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q54x5\" (UniqueName: \"kubernetes.io/projected/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-kube-api-access-q54x5\") pod \"redhat-marketplace-f528q\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.774743 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-utilities\") pod \"redhat-marketplace-f528q\" (UID: 
\"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.774833 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-catalog-content\") pod \"redhat-marketplace-f528q\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.876154 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-catalog-content\") pod \"redhat-marketplace-f528q\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.876227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q54x5\" (UniqueName: \"kubernetes.io/projected/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-kube-api-access-q54x5\") pod \"redhat-marketplace-f528q\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.876306 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-utilities\") pod \"redhat-marketplace-f528q\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.876983 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-utilities\") pod \"redhat-marketplace-f528q\" (UID: 
\"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.876979 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-catalog-content\") pod \"redhat-marketplace-f528q\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:03 crc kubenswrapper[4841]: I0130 06:04:03.910351 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q54x5\" (UniqueName: \"kubernetes.io/projected/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-kube-api-access-q54x5\") pod \"redhat-marketplace-f528q\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:04 crc kubenswrapper[4841]: I0130 06:04:04.195676 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:04 crc kubenswrapper[4841]: I0130 06:04:04.452371 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f528q"] Jan 30 06:04:04 crc kubenswrapper[4841]: I0130 06:04:04.568391 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f528q" event={"ID":"9216d9c5-0659-49dc-a557-3fbe3aa21cbc","Type":"ContainerStarted","Data":"34b6217f0ff7e283cf51a5e932d5f9da16e8bcee17c487d45c9edca4c805d85e"} Jan 30 06:04:05 crc kubenswrapper[4841]: I0130 06:04:05.575852 4841 generic.go:334] "Generic (PLEG): container finished" podID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerID="148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330" exitCode=0 Jan 30 06:04:05 crc kubenswrapper[4841]: I0130 06:04:05.575905 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f528q" 
event={"ID":"9216d9c5-0659-49dc-a557-3fbe3aa21cbc","Type":"ContainerDied","Data":"148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330"} Jan 30 06:04:05 crc kubenswrapper[4841]: I0130 06:04:05.578012 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:04:06 crc kubenswrapper[4841]: I0130 06:04:06.602275 4841 generic.go:334] "Generic (PLEG): container finished" podID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerID="cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3" exitCode=0 Jan 30 06:04:06 crc kubenswrapper[4841]: I0130 06:04:06.602456 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f528q" event={"ID":"9216d9c5-0659-49dc-a557-3fbe3aa21cbc","Type":"ContainerDied","Data":"cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3"} Jan 30 06:04:07 crc kubenswrapper[4841]: I0130 06:04:07.614088 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f528q" event={"ID":"9216d9c5-0659-49dc-a557-3fbe3aa21cbc","Type":"ContainerStarted","Data":"7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7"} Jan 30 06:04:07 crc kubenswrapper[4841]: I0130 06:04:07.642934 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f528q" podStartSLOduration=3.161918091 podStartE2EDuration="4.6429106s" podCreationTimestamp="2026-01-30 06:04:03 +0000 UTC" firstStartedPulling="2026-01-30 06:04:05.577789585 +0000 UTC m=+3382.571262223" lastFinishedPulling="2026-01-30 06:04:07.058782054 +0000 UTC m=+3384.052254732" observedRunningTime="2026-01-30 06:04:07.637228851 +0000 UTC m=+3384.630701509" watchObservedRunningTime="2026-01-30 06:04:07.6429106 +0000 UTC m=+3384.636383258" Jan 30 06:04:09 crc kubenswrapper[4841]: I0130 06:04:09.952531 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-hqr7t"] Jan 30 06:04:09 crc kubenswrapper[4841]: I0130 06:04:09.956688 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:09 crc kubenswrapper[4841]: I0130 06:04:09.975127 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqr7t"] Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.066999 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntt5d\" (UniqueName: \"kubernetes.io/projected/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-kube-api-access-ntt5d\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.067235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-catalog-content\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.067334 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-utilities\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.168962 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntt5d\" (UniqueName: \"kubernetes.io/projected/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-kube-api-access-ntt5d\") pod \"certified-operators-hqr7t\" (UID: 
\"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.169066 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-catalog-content\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.169112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-utilities\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.169746 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-catalog-content\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.169830 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-utilities\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.190114 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntt5d\" (UniqueName: \"kubernetes.io/projected/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-kube-api-access-ntt5d\") pod \"certified-operators-hqr7t\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " 
pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.317812 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.464853 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.465127 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.465172 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.465750 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b27b440560b98b1aaa629d9247b2cb76a1ed1307d7294ce65ab494bca036655"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.465802 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" 
containerID="cri-o://8b27b440560b98b1aaa629d9247b2cb76a1ed1307d7294ce65ab494bca036655" gracePeriod=600 Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.529843 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqr7t"] Jan 30 06:04:10 crc kubenswrapper[4841]: W0130 06:04:10.538569 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ce968e_0cb2_4996_85a4_b20b30f3fa80.slice/crio-6e6106a89d7d919f26df634b80ad7fdb210b6275b6b1b87c3ba0c3c197042077 WatchSource:0}: Error finding container 6e6106a89d7d919f26df634b80ad7fdb210b6275b6b1b87c3ba0c3c197042077: Status 404 returned error can't find the container with id 6e6106a89d7d919f26df634b80ad7fdb210b6275b6b1b87c3ba0c3c197042077 Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.643571 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="8b27b440560b98b1aaa629d9247b2cb76a1ed1307d7294ce65ab494bca036655" exitCode=0 Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.643632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"8b27b440560b98b1aaa629d9247b2cb76a1ed1307d7294ce65ab494bca036655"} Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.643662 4841 scope.go:117] "RemoveContainer" containerID="4134ba1f8c4807759033737045ca0d350f0bbf69e22b48a5dfca2c755bf3da75" Jan 30 06:04:10 crc kubenswrapper[4841]: I0130 06:04:10.644624 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqr7t" event={"ID":"f0ce968e-0cb2-4996-85a4-b20b30f3fa80","Type":"ContainerStarted","Data":"6e6106a89d7d919f26df634b80ad7fdb210b6275b6b1b87c3ba0c3c197042077"} Jan 30 06:04:11 crc kubenswrapper[4841]: I0130 06:04:11.656545 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"} Jan 30 06:04:11 crc kubenswrapper[4841]: I0130 06:04:11.659044 4841 generic.go:334] "Generic (PLEG): container finished" podID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerID="984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26" exitCode=0 Jan 30 06:04:11 crc kubenswrapper[4841]: I0130 06:04:11.659096 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqr7t" event={"ID":"f0ce968e-0cb2-4996-85a4-b20b30f3fa80","Type":"ContainerDied","Data":"984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26"} Jan 30 06:04:13 crc kubenswrapper[4841]: I0130 06:04:13.691519 4841 generic.go:334] "Generic (PLEG): container finished" podID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerID="d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663" exitCode=0 Jan 30 06:04:13 crc kubenswrapper[4841]: I0130 06:04:13.691651 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqr7t" event={"ID":"f0ce968e-0cb2-4996-85a4-b20b30f3fa80","Type":"ContainerDied","Data":"d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663"} Jan 30 06:04:14 crc kubenswrapper[4841]: I0130 06:04:14.196296 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:14 crc kubenswrapper[4841]: I0130 06:04:14.196762 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:14 crc kubenswrapper[4841]: I0130 06:04:14.251669 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:14 crc 
kubenswrapper[4841]: I0130 06:04:14.708387 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqr7t" event={"ID":"f0ce968e-0cb2-4996-85a4-b20b30f3fa80","Type":"ContainerStarted","Data":"3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c"} Jan 30 06:04:14 crc kubenswrapper[4841]: I0130 06:04:14.747306 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hqr7t" podStartSLOduration=3.31797273 podStartE2EDuration="5.747283321s" podCreationTimestamp="2026-01-30 06:04:09 +0000 UTC" firstStartedPulling="2026-01-30 06:04:11.661962795 +0000 UTC m=+3388.655435463" lastFinishedPulling="2026-01-30 06:04:14.091273416 +0000 UTC m=+3391.084746054" observedRunningTime="2026-01-30 06:04:14.742110356 +0000 UTC m=+3391.735583024" watchObservedRunningTime="2026-01-30 06:04:14.747283321 +0000 UTC m=+3391.740755989" Jan 30 06:04:14 crc kubenswrapper[4841]: I0130 06:04:14.788506 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:15 crc kubenswrapper[4841]: I0130 06:04:15.337092 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f528q"] Jan 30 06:04:16 crc kubenswrapper[4841]: I0130 06:04:16.743817 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f528q" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="registry-server" containerID="cri-o://7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7" gracePeriod=2 Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.258944 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.390677 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-utilities\") pod \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.391084 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q54x5\" (UniqueName: \"kubernetes.io/projected/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-kube-api-access-q54x5\") pod \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.391302 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-catalog-content\") pod \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\" (UID: \"9216d9c5-0659-49dc-a557-3fbe3aa21cbc\") " Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.392155 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-utilities" (OuterVolumeSpecName: "utilities") pod "9216d9c5-0659-49dc-a557-3fbe3aa21cbc" (UID: "9216d9c5-0659-49dc-a557-3fbe3aa21cbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.400066 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-kube-api-access-q54x5" (OuterVolumeSpecName: "kube-api-access-q54x5") pod "9216d9c5-0659-49dc-a557-3fbe3aa21cbc" (UID: "9216d9c5-0659-49dc-a557-3fbe3aa21cbc"). InnerVolumeSpecName "kube-api-access-q54x5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.422248 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9216d9c5-0659-49dc-a557-3fbe3aa21cbc" (UID: "9216d9c5-0659-49dc-a557-3fbe3aa21cbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.492974 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.493269 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q54x5\" (UniqueName: \"kubernetes.io/projected/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-kube-api-access-q54x5\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.493347 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216d9c5-0659-49dc-a557-3fbe3aa21cbc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.759920 4841 generic.go:334] "Generic (PLEG): container finished" podID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerID="7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7" exitCode=0 Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.759984 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f528q" event={"ID":"9216d9c5-0659-49dc-a557-3fbe3aa21cbc","Type":"ContainerDied","Data":"7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7"} Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.760093 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-f528q" event={"ID":"9216d9c5-0659-49dc-a557-3fbe3aa21cbc","Type":"ContainerDied","Data":"34b6217f0ff7e283cf51a5e932d5f9da16e8bcee17c487d45c9edca4c805d85e"} Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.760153 4841 scope.go:117] "RemoveContainer" containerID="7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.761148 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f528q" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.811166 4841 scope.go:117] "RemoveContainer" containerID="cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.816307 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f528q"] Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.826984 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f528q"] Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.840811 4841 scope.go:117] "RemoveContainer" containerID="148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.873473 4841 scope.go:117] "RemoveContainer" containerID="7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7" Jan 30 06:04:17 crc kubenswrapper[4841]: E0130 06:04:17.874049 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7\": container with ID starting with 7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7 not found: ID does not exist" containerID="7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.874141 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7"} err="failed to get container status \"7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7\": rpc error: code = NotFound desc = could not find container \"7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7\": container with ID starting with 7a74df871869526911b256bfffa51fd11b220d65b1963ee8fd404b0bfdbddfc7 not found: ID does not exist" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.874211 4841 scope.go:117] "RemoveContainer" containerID="cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3" Jan 30 06:04:17 crc kubenswrapper[4841]: E0130 06:04:17.874877 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3\": container with ID starting with cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3 not found: ID does not exist" containerID="cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.874940 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3"} err="failed to get container status \"cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3\": rpc error: code = NotFound desc = could not find container \"cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3\": container with ID starting with cb453ed462efb7805db48ec77606717fd1cfc03f1fe1e22ad10891ac3c7fc9e3 not found: ID does not exist" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.874978 4841 scope.go:117] "RemoveContainer" containerID="148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330" Jan 30 06:04:17 crc kubenswrapper[4841]: E0130 
06:04:17.875861 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330\": container with ID starting with 148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330 not found: ID does not exist" containerID="148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330" Jan 30 06:04:17 crc kubenswrapper[4841]: I0130 06:04:17.875931 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330"} err="failed to get container status \"148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330\": rpc error: code = NotFound desc = could not find container \"148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330\": container with ID starting with 148e84caa9fee8276456345b5cad54ff569a07bf5e388a7099e2778f09ae2330 not found: ID does not exist" Jan 30 06:04:18 crc kubenswrapper[4841]: I0130 06:04:18.447953 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" path="/var/lib/kubelet/pods/9216d9c5-0659-49dc-a557-3fbe3aa21cbc/volumes" Jan 30 06:04:20 crc kubenswrapper[4841]: I0130 06:04:20.318775 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:20 crc kubenswrapper[4841]: I0130 06:04:20.319199 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:20 crc kubenswrapper[4841]: I0130 06:04:20.384727 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:20 crc kubenswrapper[4841]: I0130 06:04:20.850245 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:21 crc kubenswrapper[4841]: I0130 06:04:21.540450 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqr7t"] Jan 30 06:04:22 crc kubenswrapper[4841]: I0130 06:04:22.805262 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hqr7t" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="registry-server" containerID="cri-o://3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c" gracePeriod=2 Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.285938 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.325428 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-utilities\") pod \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.325581 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-catalog-content\") pod \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.325663 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntt5d\" (UniqueName: \"kubernetes.io/projected/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-kube-api-access-ntt5d\") pod \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\" (UID: \"f0ce968e-0cb2-4996-85a4-b20b30f3fa80\") " Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.327188 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-utilities" (OuterVolumeSpecName: "utilities") pod "f0ce968e-0cb2-4996-85a4-b20b30f3fa80" (UID: "f0ce968e-0cb2-4996-85a4-b20b30f3fa80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.335761 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-kube-api-access-ntt5d" (OuterVolumeSpecName: "kube-api-access-ntt5d") pod "f0ce968e-0cb2-4996-85a4-b20b30f3fa80" (UID: "f0ce968e-0cb2-4996-85a4-b20b30f3fa80"). InnerVolumeSpecName "kube-api-access-ntt5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.427151 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntt5d\" (UniqueName: \"kubernetes.io/projected/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-kube-api-access-ntt5d\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.427200 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.540826 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0ce968e-0cb2-4996-85a4-b20b30f3fa80" (UID: "f0ce968e-0cb2-4996-85a4-b20b30f3fa80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.630298 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0ce968e-0cb2-4996-85a4-b20b30f3fa80-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.819759 4841 generic.go:334] "Generic (PLEG): container finished" podID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerID="3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c" exitCode=0 Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.819776 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqr7t" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.819847 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqr7t" event={"ID":"f0ce968e-0cb2-4996-85a4-b20b30f3fa80","Type":"ContainerDied","Data":"3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c"} Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.820691 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqr7t" event={"ID":"f0ce968e-0cb2-4996-85a4-b20b30f3fa80","Type":"ContainerDied","Data":"6e6106a89d7d919f26df634b80ad7fdb210b6275b6b1b87c3ba0c3c197042077"} Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.820736 4841 scope.go:117] "RemoveContainer" containerID="3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.856056 4841 scope.go:117] "RemoveContainer" containerID="d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.886482 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqr7t"] Jan 30 06:04:23 crc kubenswrapper[4841]: 
I0130 06:04:23.899597 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hqr7t"] Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.911768 4841 scope.go:117] "RemoveContainer" containerID="984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.935473 4841 scope.go:117] "RemoveContainer" containerID="3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c" Jan 30 06:04:23 crc kubenswrapper[4841]: E0130 06:04:23.936143 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c\": container with ID starting with 3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c not found: ID does not exist" containerID="3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.936197 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c"} err="failed to get container status \"3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c\": rpc error: code = NotFound desc = could not find container \"3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c\": container with ID starting with 3ae29af4c73714a1499fde20222211929c5851b720b443a2fc50b81be8a99b2c not found: ID does not exist" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.936232 4841 scope.go:117] "RemoveContainer" containerID="d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663" Jan 30 06:04:23 crc kubenswrapper[4841]: E0130 06:04:23.936751 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663\": container 
with ID starting with d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663 not found: ID does not exist" containerID="d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.936820 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663"} err="failed to get container status \"d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663\": rpc error: code = NotFound desc = could not find container \"d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663\": container with ID starting with d679b7ce5f70de090ec03a661d3d6e1c766ecdb9459bbe502ef6e50f32f7e663 not found: ID does not exist" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.936866 4841 scope.go:117] "RemoveContainer" containerID="984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26" Jan 30 06:04:23 crc kubenswrapper[4841]: E0130 06:04:23.937519 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26\": container with ID starting with 984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26 not found: ID does not exist" containerID="984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26" Jan 30 06:04:23 crc kubenswrapper[4841]: I0130 06:04:23.937561 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26"} err="failed to get container status \"984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26\": rpc error: code = NotFound desc = could not find container \"984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26\": container with ID starting with 984fe1c9fb7dd333a812281d866a77b6d4f50f1f48043b48af0c87b649c6ab26 not 
found: ID does not exist" Jan 30 06:04:24 crc kubenswrapper[4841]: I0130 06:04:24.458770 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" path="/var/lib/kubelet/pods/f0ce968e-0cb2-4996-85a4-b20b30f3fa80/volumes" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.023187 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ww7np"] Jan 30 06:05:39 crc kubenswrapper[4841]: E0130 06:05:39.024237 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="registry-server" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024258 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="registry-server" Jan 30 06:05:39 crc kubenswrapper[4841]: E0130 06:05:39.024278 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="extract-utilities" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024291 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="extract-utilities" Jan 30 06:05:39 crc kubenswrapper[4841]: E0130 06:05:39.024314 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="registry-server" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024327 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="registry-server" Jan 30 06:05:39 crc kubenswrapper[4841]: E0130 06:05:39.024351 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="extract-content" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024367 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="extract-content" Jan 30 06:05:39 crc kubenswrapper[4841]: E0130 06:05:39.024387 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="extract-utilities" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024422 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="extract-utilities" Jan 30 06:05:39 crc kubenswrapper[4841]: E0130 06:05:39.024450 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="extract-content" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024462 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="extract-content" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024691 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ce968e-0cb2-4996-85a4-b20b30f3fa80" containerName="registry-server" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.024715 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9216d9c5-0659-49dc-a557-3fbe3aa21cbc" containerName="registry-server" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.028792 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.042867 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ww7np"] Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.169433 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-utilities\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.169483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b69f8\" (UniqueName: \"kubernetes.io/projected/6a40cd63-8877-4559-9248-d2f2fb6636c4-kube-api-access-b69f8\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.169540 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-catalog-content\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.270678 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-utilities\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.270734 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b69f8\" (UniqueName: \"kubernetes.io/projected/6a40cd63-8877-4559-9248-d2f2fb6636c4-kube-api-access-b69f8\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.270795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-catalog-content\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.271296 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-utilities\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.271335 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-catalog-content\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.299239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b69f8\" (UniqueName: \"kubernetes.io/projected/6a40cd63-8877-4559-9248-d2f2fb6636c4-kube-api-access-b69f8\") pod \"community-operators-ww7np\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.359776 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:39 crc kubenswrapper[4841]: I0130 06:05:39.910382 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ww7np"] Jan 30 06:05:40 crc kubenswrapper[4841]: I0130 06:05:40.551008 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerID="28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31" exitCode=0 Jan 30 06:05:40 crc kubenswrapper[4841]: I0130 06:05:40.551240 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ww7np" event={"ID":"6a40cd63-8877-4559-9248-d2f2fb6636c4","Type":"ContainerDied","Data":"28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31"} Jan 30 06:05:40 crc kubenswrapper[4841]: I0130 06:05:40.551357 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ww7np" event={"ID":"6a40cd63-8877-4559-9248-d2f2fb6636c4","Type":"ContainerStarted","Data":"a96607062024580a48ec89c3913e5354e7dff240ec39309cec0ad7ca45659f72"} Jan 30 06:05:42 crc kubenswrapper[4841]: I0130 06:05:42.570394 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerID="3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767" exitCode=0 Jan 30 06:05:42 crc kubenswrapper[4841]: I0130 06:05:42.570585 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ww7np" event={"ID":"6a40cd63-8877-4559-9248-d2f2fb6636c4","Type":"ContainerDied","Data":"3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767"} Jan 30 06:05:43 crc kubenswrapper[4841]: I0130 06:05:43.582656 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ww7np" 
event={"ID":"6a40cd63-8877-4559-9248-d2f2fb6636c4","Type":"ContainerStarted","Data":"e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5"} Jan 30 06:05:43 crc kubenswrapper[4841]: I0130 06:05:43.610886 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ww7np" podStartSLOduration=3.199919238 podStartE2EDuration="5.610862487s" podCreationTimestamp="2026-01-30 06:05:38 +0000 UTC" firstStartedPulling="2026-01-30 06:05:40.553552504 +0000 UTC m=+3477.547025172" lastFinishedPulling="2026-01-30 06:05:42.964495743 +0000 UTC m=+3479.957968421" observedRunningTime="2026-01-30 06:05:43.604388198 +0000 UTC m=+3480.597860846" watchObservedRunningTime="2026-01-30 06:05:43.610862487 +0000 UTC m=+3480.604335165" Jan 30 06:05:49 crc kubenswrapper[4841]: I0130 06:05:49.361347 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:49 crc kubenswrapper[4841]: I0130 06:05:49.362094 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:49 crc kubenswrapper[4841]: I0130 06:05:49.440609 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:49 crc kubenswrapper[4841]: I0130 06:05:49.703331 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:49 crc kubenswrapper[4841]: I0130 06:05:49.782561 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ww7np"] Jan 30 06:05:51 crc kubenswrapper[4841]: I0130 06:05:51.645012 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ww7np" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="registry-server" 
containerID="cri-o://e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5" gracePeriod=2 Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.073206 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.172238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b69f8\" (UniqueName: \"kubernetes.io/projected/6a40cd63-8877-4559-9248-d2f2fb6636c4-kube-api-access-b69f8\") pod \"6a40cd63-8877-4559-9248-d2f2fb6636c4\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.172329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-catalog-content\") pod \"6a40cd63-8877-4559-9248-d2f2fb6636c4\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.172448 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-utilities\") pod \"6a40cd63-8877-4559-9248-d2f2fb6636c4\" (UID: \"6a40cd63-8877-4559-9248-d2f2fb6636c4\") " Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.175279 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-utilities" (OuterVolumeSpecName: "utilities") pod "6a40cd63-8877-4559-9248-d2f2fb6636c4" (UID: "6a40cd63-8877-4559-9248-d2f2fb6636c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.184716 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a40cd63-8877-4559-9248-d2f2fb6636c4-kube-api-access-b69f8" (OuterVolumeSpecName: "kube-api-access-b69f8") pod "6a40cd63-8877-4559-9248-d2f2fb6636c4" (UID: "6a40cd63-8877-4559-9248-d2f2fb6636c4"). InnerVolumeSpecName "kube-api-access-b69f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.229142 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a40cd63-8877-4559-9248-d2f2fb6636c4" (UID: "6a40cd63-8877-4559-9248-d2f2fb6636c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.274256 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b69f8\" (UniqueName: \"kubernetes.io/projected/6a40cd63-8877-4559-9248-d2f2fb6636c4-kube-api-access-b69f8\") on node \"crc\" DevicePath \"\"" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.274314 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.274333 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a40cd63-8877-4559-9248-d2f2fb6636c4-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.659528 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a40cd63-8877-4559-9248-d2f2fb6636c4" 
containerID="e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5" exitCode=0 Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.659590 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ww7np" event={"ID":"6a40cd63-8877-4559-9248-d2f2fb6636c4","Type":"ContainerDied","Data":"e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5"} Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.659628 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ww7np" event={"ID":"6a40cd63-8877-4559-9248-d2f2fb6636c4","Type":"ContainerDied","Data":"a96607062024580a48ec89c3913e5354e7dff240ec39309cec0ad7ca45659f72"} Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.659656 4841 scope.go:117] "RemoveContainer" containerID="e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.659653 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ww7np" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.713069 4841 scope.go:117] "RemoveContainer" containerID="3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.713234 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ww7np"] Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.720192 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ww7np"] Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.740702 4841 scope.go:117] "RemoveContainer" containerID="28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.756028 4841 scope.go:117] "RemoveContainer" containerID="e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5" Jan 30 06:05:52 crc kubenswrapper[4841]: E0130 06:05:52.756300 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5\": container with ID starting with e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5 not found: ID does not exist" containerID="e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.756326 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5"} err="failed to get container status \"e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5\": rpc error: code = NotFound desc = could not find container \"e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5\": container with ID starting with e1f778b667b129a67e10a807854968c1d89b3558ef7c31b5de66e08fdee309d5 not 
found: ID does not exist" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.756345 4841 scope.go:117] "RemoveContainer" containerID="3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767" Jan 30 06:05:52 crc kubenswrapper[4841]: E0130 06:05:52.756614 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767\": container with ID starting with 3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767 not found: ID does not exist" containerID="3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.756634 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767"} err="failed to get container status \"3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767\": rpc error: code = NotFound desc = could not find container \"3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767\": container with ID starting with 3a38fc7401eda5f2e4d36b91a4d3ee5ca2eef8b19b34382f94f8be506494f767 not found: ID does not exist" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.756647 4841 scope.go:117] "RemoveContainer" containerID="28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31" Jan 30 06:05:52 crc kubenswrapper[4841]: E0130 06:05:52.756859 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31\": container with ID starting with 28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31 not found: ID does not exist" containerID="28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31" Jan 30 06:05:52 crc kubenswrapper[4841]: I0130 06:05:52.756877 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31"} err="failed to get container status \"28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31\": rpc error: code = NotFound desc = could not find container \"28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31\": container with ID starting with 28bbeb56c200f689e98a40c46e766bf00800c6c177b80429e4ec3e222e1f9f31 not found: ID does not exist" Jan 30 06:05:54 crc kubenswrapper[4841]: I0130 06:05:54.447979 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" path="/var/lib/kubelet/pods/6a40cd63-8877-4559-9248-d2f2fb6636c4/volumes" Jan 30 06:06:10 crc kubenswrapper[4841]: I0130 06:06:10.464533 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:06:10 crc kubenswrapper[4841]: I0130 06:06:10.465179 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:06:40 crc kubenswrapper[4841]: I0130 06:06:40.464091 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:06:40 crc kubenswrapper[4841]: I0130 06:06:40.465886 4841 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:07:10 crc kubenswrapper[4841]: I0130 06:07:10.463942 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:07:10 crc kubenswrapper[4841]: I0130 06:07:10.465642 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:07:10 crc kubenswrapper[4841]: I0130 06:07:10.465915 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:07:10 crc kubenswrapper[4841]: I0130 06:07:10.466905 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:07:10 crc kubenswrapper[4841]: I0130 06:07:10.467071 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" 
containerID="cri-o://f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" gracePeriod=600 Jan 30 06:07:10 crc kubenswrapper[4841]: E0130 06:07:10.620858 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:07:11 crc kubenswrapper[4841]: I0130 06:07:11.374055 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" exitCode=0 Jan 30 06:07:11 crc kubenswrapper[4841]: I0130 06:07:11.374116 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"} Jan 30 06:07:11 crc kubenswrapper[4841]: I0130 06:07:11.374214 4841 scope.go:117] "RemoveContainer" containerID="8b27b440560b98b1aaa629d9247b2cb76a1ed1307d7294ce65ab494bca036655" Jan 30 06:07:11 crc kubenswrapper[4841]: I0130 06:07:11.374929 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:07:11 crc kubenswrapper[4841]: E0130 06:07:11.375305 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" 
podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:07:26 crc kubenswrapper[4841]: I0130 06:07:26.432572 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:07:26 crc kubenswrapper[4841]: E0130 06:07:26.433323 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:07:37 crc kubenswrapper[4841]: I0130 06:07:37.432202 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:07:37 crc kubenswrapper[4841]: E0130 06:07:37.433340 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:07:48 crc kubenswrapper[4841]: I0130 06:07:48.432309 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:07:48 crc kubenswrapper[4841]: E0130 06:07:48.433374 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:08:01 crc kubenswrapper[4841]: I0130 06:08:01.432160 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:08:01 crc kubenswrapper[4841]: E0130 06:08:01.433054 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:08:12 crc kubenswrapper[4841]: I0130 06:08:12.432571 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:08:12 crc kubenswrapper[4841]: E0130 06:08:12.433347 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:08:25 crc kubenswrapper[4841]: I0130 06:08:25.431937 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:08:25 crc kubenswrapper[4841]: E0130 06:08:25.432874 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:08:39 crc kubenswrapper[4841]: I0130 06:08:39.431814 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:08:39 crc kubenswrapper[4841]: E0130 06:08:39.434927 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:08:51 crc kubenswrapper[4841]: I0130 06:08:51.431566 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:08:51 crc kubenswrapper[4841]: E0130 06:08:51.432472 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:09:04 crc kubenswrapper[4841]: I0130 06:09:04.440593 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:09:04 crc kubenswrapper[4841]: E0130 06:09:04.441549 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:09:17 crc kubenswrapper[4841]: I0130 06:09:17.431668 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:09:17 crc kubenswrapper[4841]: E0130 06:09:17.432617 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:09:29 crc kubenswrapper[4841]: I0130 06:09:29.431963 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:09:29 crc kubenswrapper[4841]: E0130 06:09:29.432997 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:09:43 crc kubenswrapper[4841]: I0130 06:09:43.432184 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801" Jan 30 06:09:43 crc kubenswrapper[4841]: E0130 06:09:43.433147 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:09:58 crc kubenswrapper[4841]: I0130 06:09:58.432187 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:09:58 crc kubenswrapper[4841]: E0130 06:09:58.432947 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:10:12 crc kubenswrapper[4841]: I0130 06:10:12.432445 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:10:12 crc kubenswrapper[4841]: E0130 06:10:12.433339 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:10:25 crc kubenswrapper[4841]: I0130 06:10:25.432447 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:10:25 crc kubenswrapper[4841]: E0130 06:10:25.433186 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:10:37 crc kubenswrapper[4841]: I0130 06:10:37.432151 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:10:37 crc kubenswrapper[4841]: E0130 06:10:37.433168 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:10:50 crc kubenswrapper[4841]: I0130 06:10:50.432363 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:10:50 crc kubenswrapper[4841]: E0130 06:10:50.433513 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:11:02 crc kubenswrapper[4841]: I0130 06:11:02.436469 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:11:02 crc kubenswrapper[4841]: E0130 06:11:02.437454 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:11:17 crc kubenswrapper[4841]: I0130 06:11:17.431884 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:11:17 crc kubenswrapper[4841]: E0130 06:11:17.432918 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:11:32 crc kubenswrapper[4841]: I0130 06:11:32.431895 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:11:32 crc kubenswrapper[4841]: E0130 06:11:32.432632 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:11:43 crc kubenswrapper[4841]: I0130 06:11:43.431932 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:11:43 crc kubenswrapper[4841]: E0130 06:11:43.432993 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:11:54 crc kubenswrapper[4841]: I0130 06:11:54.436331 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:11:54 crc kubenswrapper[4841]: E0130 06:11:54.437257 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:12:06 crc kubenswrapper[4841]: I0130 06:12:06.432445 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:12:06 crc kubenswrapper[4841]: E0130 06:12:06.433668 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:12:21 crc kubenswrapper[4841]: I0130 06:12:21.432780 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:12:22 crc kubenswrapper[4841]: I0130 06:12:22.627440 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"c3f2d4df2271d53f797d15ee95e4485f03a994633066034c36fad35ff2c88b42"}
Jan 30 06:14:40 crc kubenswrapper[4841]: I0130 06:14:40.464097 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:14:40 crc kubenswrapper[4841]: I0130 06:14:40.464580 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.194293 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"]
Jan 30 06:15:00 crc kubenswrapper[4841]: E0130 06:15:00.195076 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="extract-content"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.195088 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="extract-content"
Jan 30 06:15:00 crc kubenswrapper[4841]: E0130 06:15:00.195104 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="extract-utilities"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.195111 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="extract-utilities"
Jan 30 06:15:00 crc kubenswrapper[4841]: E0130 06:15:00.195120 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="registry-server"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.195126 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="registry-server"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.195273 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a40cd63-8877-4559-9248-d2f2fb6636c4" containerName="registry-server"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.195711 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.199802 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.224843 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.235222 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"]
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.326580 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxzd\" (UniqueName: \"kubernetes.io/projected/3aee0eb8-1461-42b5-b53c-b020d519ee43-kube-api-access-8nxzd\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.326680 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aee0eb8-1461-42b5-b53c-b020d519ee43-config-volume\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.326740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aee0eb8-1461-42b5-b53c-b020d519ee43-secret-volume\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.428769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aee0eb8-1461-42b5-b53c-b020d519ee43-secret-volume\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.428896 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nxzd\" (UniqueName: \"kubernetes.io/projected/3aee0eb8-1461-42b5-b53c-b020d519ee43-kube-api-access-8nxzd\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.429006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aee0eb8-1461-42b5-b53c-b020d519ee43-config-volume\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.430706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aee0eb8-1461-42b5-b53c-b020d519ee43-config-volume\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.437504 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aee0eb8-1461-42b5-b53c-b020d519ee43-secret-volume\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.452776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxzd\" (UniqueName: \"kubernetes.io/projected/3aee0eb8-1461-42b5-b53c-b020d519ee43-kube-api-access-8nxzd\") pod \"collect-profiles-29495895-brpss\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:00 crc kubenswrapper[4841]: I0130 06:15:00.536349 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:01 crc kubenswrapper[4841]: I0130 06:15:01.025033 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"]
Jan 30 06:15:01 crc kubenswrapper[4841]: I0130 06:15:01.144576 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss" event={"ID":"3aee0eb8-1461-42b5-b53c-b020d519ee43","Type":"ContainerStarted","Data":"b0d342634b65b5ff51b1803f99f4a55d64b0c78993aef3cef3e8f6eae74404fa"}
Jan 30 06:15:02 crc kubenswrapper[4841]: I0130 06:15:02.156709 4841 generic.go:334] "Generic (PLEG): container finished" podID="3aee0eb8-1461-42b5-b53c-b020d519ee43" containerID="15203c44eb2199d094440f479d6223702bccf3b5cd6b611e91dc45bbc260a8ff" exitCode=0
Jan 30 06:15:02 crc kubenswrapper[4841]: I0130 06:15:02.156799 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss" event={"ID":"3aee0eb8-1461-42b5-b53c-b020d519ee43","Type":"ContainerDied","Data":"15203c44eb2199d094440f479d6223702bccf3b5cd6b611e91dc45bbc260a8ff"}
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.711537 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.776551 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aee0eb8-1461-42b5-b53c-b020d519ee43-secret-volume\") pod \"3aee0eb8-1461-42b5-b53c-b020d519ee43\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") "
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.776682 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aee0eb8-1461-42b5-b53c-b020d519ee43-config-volume\") pod \"3aee0eb8-1461-42b5-b53c-b020d519ee43\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") "
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.776860 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nxzd\" (UniqueName: \"kubernetes.io/projected/3aee0eb8-1461-42b5-b53c-b020d519ee43-kube-api-access-8nxzd\") pod \"3aee0eb8-1461-42b5-b53c-b020d519ee43\" (UID: \"3aee0eb8-1461-42b5-b53c-b020d519ee43\") "
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.777577 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aee0eb8-1461-42b5-b53c-b020d519ee43-config-volume" (OuterVolumeSpecName: "config-volume") pod "3aee0eb8-1461-42b5-b53c-b020d519ee43" (UID: "3aee0eb8-1461-42b5-b53c-b020d519ee43"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.778728 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3aee0eb8-1461-42b5-b53c-b020d519ee43-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.783571 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aee0eb8-1461-42b5-b53c-b020d519ee43-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3aee0eb8-1461-42b5-b53c-b020d519ee43" (UID: "3aee0eb8-1461-42b5-b53c-b020d519ee43"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.783993 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aee0eb8-1461-42b5-b53c-b020d519ee43-kube-api-access-8nxzd" (OuterVolumeSpecName: "kube-api-access-8nxzd") pod "3aee0eb8-1461-42b5-b53c-b020d519ee43" (UID: "3aee0eb8-1461-42b5-b53c-b020d519ee43"). InnerVolumeSpecName "kube-api-access-8nxzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.880647 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nxzd\" (UniqueName: \"kubernetes.io/projected/3aee0eb8-1461-42b5-b53c-b020d519ee43-kube-api-access-8nxzd\") on node \"crc\" DevicePath \"\""
Jan 30 06:15:03 crc kubenswrapper[4841]: I0130 06:15:03.880696 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3aee0eb8-1461-42b5-b53c-b020d519ee43-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 06:15:04 crc kubenswrapper[4841]: I0130 06:15:04.173527 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss" event={"ID":"3aee0eb8-1461-42b5-b53c-b020d519ee43","Type":"ContainerDied","Data":"b0d342634b65b5ff51b1803f99f4a55d64b0c78993aef3cef3e8f6eae74404fa"}
Jan 30 06:15:04 crc kubenswrapper[4841]: I0130 06:15:04.173570 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d342634b65b5ff51b1803f99f4a55d64b0c78993aef3cef3e8f6eae74404fa"
Jan 30 06:15:04 crc kubenswrapper[4841]: I0130 06:15:04.173635 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"
Jan 30 06:15:04 crc kubenswrapper[4841]: I0130 06:15:04.817695 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz"]
Jan 30 06:15:04 crc kubenswrapper[4841]: I0130 06:15:04.823526 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495850-v5jgz"]
Jan 30 06:15:06 crc kubenswrapper[4841]: I0130 06:15:06.449872 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3827ca89-b447-4c79-a946-bb1170c1e039" path="/var/lib/kubelet/pods/3827ca89-b447-4c79-a946-bb1170c1e039/volumes"
Jan 30 06:15:10 crc kubenswrapper[4841]: I0130 06:15:10.464266 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:15:10 crc kubenswrapper[4841]: I0130 06:15:10.464709 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.021946 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5bht"]
Jan 30 06:15:28 crc kubenswrapper[4841]: E0130 06:15:28.023543 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aee0eb8-1461-42b5-b53c-b020d519ee43" containerName="collect-profiles"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.023568 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aee0eb8-1461-42b5-b53c-b020d519ee43" containerName="collect-profiles"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.023848 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aee0eb8-1461-42b5-b53c-b020d519ee43" containerName="collect-profiles"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.025520 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.071002 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5bht"]
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.213160 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-utilities\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.213352 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-catalog-content\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.213464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmk4\" (UniqueName: \"kubernetes.io/projected/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-kube-api-access-8bmk4\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.314386 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-utilities\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.314517 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-catalog-content\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.314559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmk4\" (UniqueName: \"kubernetes.io/projected/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-kube-api-access-8bmk4\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.314927 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-utilities\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.315034 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-catalog-content\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.343426 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmk4\" (UniqueName: \"kubernetes.io/projected/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-kube-api-access-8bmk4\") pod \"redhat-marketplace-d5bht\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") " pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.387971 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:28 crc kubenswrapper[4841]: I0130 06:15:28.879214 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5bht"]
Jan 30 06:15:29 crc kubenswrapper[4841]: I0130 06:15:29.398977 4841 generic.go:334] "Generic (PLEG): container finished" podID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerID="b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5" exitCode=0
Jan 30 06:15:29 crc kubenswrapper[4841]: I0130 06:15:29.399074 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5bht" event={"ID":"9433f72b-3889-48f5-b7d1-5c5f4d14bc76","Type":"ContainerDied","Data":"b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5"}
Jan 30 06:15:29 crc kubenswrapper[4841]: I0130 06:15:29.399299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5bht" event={"ID":"9433f72b-3889-48f5-b7d1-5c5f4d14bc76","Type":"ContainerStarted","Data":"e0fb40837786a72105c04410cb2307c87d689430d720053654e3a2f417f379b1"}
Jan 30 06:15:29 crc kubenswrapper[4841]: I0130 06:15:29.401716 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 06:15:30 crc kubenswrapper[4841]: I0130 06:15:30.457299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5bht" event={"ID":"9433f72b-3889-48f5-b7d1-5c5f4d14bc76","Type":"ContainerStarted","Data":"0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb"}
Jan 30 06:15:31 crc kubenswrapper[4841]: I0130 06:15:31.454049 4841 generic.go:334] "Generic (PLEG): container finished" podID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerID="0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb" exitCode=0
Jan 30 06:15:31 crc kubenswrapper[4841]: I0130 06:15:31.454090 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5bht" event={"ID":"9433f72b-3889-48f5-b7d1-5c5f4d14bc76","Type":"ContainerDied","Data":"0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb"}
Jan 30 06:15:32 crc kubenswrapper[4841]: I0130 06:15:32.463235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5bht" event={"ID":"9433f72b-3889-48f5-b7d1-5c5f4d14bc76","Type":"ContainerStarted","Data":"aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b"}
Jan 30 06:15:32 crc kubenswrapper[4841]: I0130 06:15:32.484524 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d5bht" podStartSLOduration=2.903484643 podStartE2EDuration="5.48449276s" podCreationTimestamp="2026-01-30 06:15:27 +0000 UTC" firstStartedPulling="2026-01-30 06:15:29.401199014 +0000 UTC m=+4066.394671692" lastFinishedPulling="2026-01-30 06:15:31.982207171 +0000 UTC m=+4068.975679809" observedRunningTime="2026-01-30 06:15:32.483549065 +0000 UTC m=+4069.477021743" watchObservedRunningTime="2026-01-30 06:15:32.48449276 +0000 UTC m=+4069.477965408"
Jan 30 06:15:38 crc kubenswrapper[4841]: I0130 06:15:38.388466 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:38 crc kubenswrapper[4841]: I0130 06:15:38.389908 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:38 crc kubenswrapper[4841]: I0130 06:15:38.471301 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:38 crc kubenswrapper[4841]: I0130 06:15:38.579454 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:38 crc kubenswrapper[4841]: I0130 06:15:38.706111 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5bht"]
Jan 30 06:15:40 crc kubenswrapper[4841]: I0130 06:15:40.463653 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:15:40 crc kubenswrapper[4841]: I0130 06:15:40.464008 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:15:40 crc kubenswrapper[4841]: I0130 06:15:40.464063 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 06:15:40 crc kubenswrapper[4841]: I0130 06:15:40.464805 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3f2d4df2271d53f797d15ee95e4485f03a994633066034c36fad35ff2c88b42"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 06:15:40 crc kubenswrapper[4841]: I0130 06:15:40.464896 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://c3f2d4df2271d53f797d15ee95e4485f03a994633066034c36fad35ff2c88b42" gracePeriod=600
Jan 30 06:15:40 crc kubenswrapper[4841]: I0130 06:15:40.526795 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5bht" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="registry-server" containerID="cri-o://aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b" gracePeriod=2
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.044100 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5bht"
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.226992 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-catalog-content\") pod \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") "
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.227483 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-utilities\") pod \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") "
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.227672 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bmk4\" (UniqueName: \"kubernetes.io/projected/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-kube-api-access-8bmk4\") pod \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\" (UID: \"9433f72b-3889-48f5-b7d1-5c5f4d14bc76\") "
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.228370 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-utilities" (OuterVolumeSpecName: "utilities") pod "9433f72b-3889-48f5-b7d1-5c5f4d14bc76" (UID: "9433f72b-3889-48f5-b7d1-5c5f4d14bc76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.233515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-kube-api-access-8bmk4" (OuterVolumeSpecName: "kube-api-access-8bmk4") pod "9433f72b-3889-48f5-b7d1-5c5f4d14bc76" (UID: "9433f72b-3889-48f5-b7d1-5c5f4d14bc76"). InnerVolumeSpecName "kube-api-access-8bmk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.265055 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9433f72b-3889-48f5-b7d1-5c5f4d14bc76" (UID: "9433f72b-3889-48f5-b7d1-5c5f4d14bc76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.329790 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bmk4\" (UniqueName: \"kubernetes.io/projected/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-kube-api-access-8bmk4\") on node \"crc\" DevicePath \"\""
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.329825 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.329838 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9433f72b-3889-48f5-b7d1-5c5f4d14bc76-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.545863 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="c3f2d4df2271d53f797d15ee95e4485f03a994633066034c36fad35ff2c88b42" exitCode=0
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.545970 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"c3f2d4df2271d53f797d15ee95e4485f03a994633066034c36fad35ff2c88b42"}
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.546074 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842"}
Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.546110 4841 scope.go:117] "RemoveContainer" containerID="f3601515815e3c42332c21021b76d70a4b5dad7e8eefe16b5b284ef1dd1f0801"
Jan 30 06:15:41 crc
kubenswrapper[4841]: I0130 06:15:41.550533 4841 generic.go:334] "Generic (PLEG): container finished" podID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerID="aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b" exitCode=0 Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.550588 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5bht" event={"ID":"9433f72b-3889-48f5-b7d1-5c5f4d14bc76","Type":"ContainerDied","Data":"aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b"} Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.550630 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5bht" event={"ID":"9433f72b-3889-48f5-b7d1-5c5f4d14bc76","Type":"ContainerDied","Data":"e0fb40837786a72105c04410cb2307c87d689430d720053654e3a2f417f379b1"} Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.550714 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5bht" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.606715 4841 scope.go:117] "RemoveContainer" containerID="aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.610609 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5bht"] Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.616931 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5bht"] Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.631226 4841 scope.go:117] "RemoveContainer" containerID="0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.669456 4841 scope.go:117] "RemoveContainer" containerID="b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5" Jan 30 06:15:41 crc kubenswrapper[4841]: 
I0130 06:15:41.697314 4841 scope.go:117] "RemoveContainer" containerID="aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b" Jan 30 06:15:41 crc kubenswrapper[4841]: E0130 06:15:41.698779 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b\": container with ID starting with aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b not found: ID does not exist" containerID="aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.698833 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b"} err="failed to get container status \"aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b\": rpc error: code = NotFound desc = could not find container \"aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b\": container with ID starting with aeb01163647453ae67a27cd2e1501b869990dbb2bd1a3ac1d31ef3e60afef30b not found: ID does not exist" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.698866 4841 scope.go:117] "RemoveContainer" containerID="0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb" Jan 30 06:15:41 crc kubenswrapper[4841]: E0130 06:15:41.699270 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb\": container with ID starting with 0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb not found: ID does not exist" containerID="0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.699327 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb"} err="failed to get container status \"0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb\": rpc error: code = NotFound desc = could not find container \"0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb\": container with ID starting with 0bcda87d19525168d363dfdf3c5712fed709745883945ba0613e43de47b053fb not found: ID does not exist" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.699366 4841 scope.go:117] "RemoveContainer" containerID="b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5" Jan 30 06:15:41 crc kubenswrapper[4841]: E0130 06:15:41.699892 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5\": container with ID starting with b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5 not found: ID does not exist" containerID="b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5" Jan 30 06:15:41 crc kubenswrapper[4841]: I0130 06:15:41.699937 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5"} err="failed to get container status \"b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5\": rpc error: code = NotFound desc = could not find container \"b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5\": container with ID starting with b06f36e8f9ea8700d6a30ddb9726dd5a7feed677c7f2c0093ff0ce1fb9b5f2c5 not found: ID does not exist" Jan 30 06:15:42 crc kubenswrapper[4841]: I0130 06:15:42.442824 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" path="/var/lib/kubelet/pods/9433f72b-3889-48f5-b7d1-5c5f4d14bc76/volumes" Jan 30 06:15:50 crc kubenswrapper[4841]: I0130 
06:15:50.504278 4841 scope.go:117] "RemoveContainer" containerID="435aae0b9f4a9db91cf6f64a68c68cc9cf71c6c3a4aa3c5817295a3f6d932f9d" Jan 30 06:17:40 crc kubenswrapper[4841]: I0130 06:17:40.464505 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:17:40 crc kubenswrapper[4841]: I0130 06:17:40.465235 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:18:10 crc kubenswrapper[4841]: I0130 06:18:10.463562 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:18:10 crc kubenswrapper[4841]: I0130 06:18:10.464574 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:18:40 crc kubenswrapper[4841]: I0130 06:18:40.463997 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:18:40 crc 
kubenswrapper[4841]: I0130 06:18:40.464465 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:18:40 crc kubenswrapper[4841]: I0130 06:18:40.464510 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:18:40 crc kubenswrapper[4841]: I0130 06:18:40.465126 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:18:40 crc kubenswrapper[4841]: I0130 06:18:40.465191 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" gracePeriod=600 Jan 30 06:18:40 crc kubenswrapper[4841]: E0130 06:18:40.618117 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:18:41 crc kubenswrapper[4841]: I0130 06:18:41.290573 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" exitCode=0 Jan 30 06:18:41 crc kubenswrapper[4841]: I0130 06:18:41.290650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842"} Jan 30 06:18:41 crc kubenswrapper[4841]: I0130 06:18:41.290707 4841 scope.go:117] "RemoveContainer" containerID="c3f2d4df2271d53f797d15ee95e4485f03a994633066034c36fad35ff2c88b42" Jan 30 06:18:41 crc kubenswrapper[4841]: I0130 06:18:41.291496 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:18:41 crc kubenswrapper[4841]: E0130 06:18:41.291875 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.577768 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-swf2b"] Jan 30 06:18:51 crc kubenswrapper[4841]: E0130 06:18:51.578957 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="extract-content" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.578983 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="extract-content" Jan 30 06:18:51 crc kubenswrapper[4841]: E0130 06:18:51.579014 4841 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="extract-utilities" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.579029 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="extract-utilities" Jan 30 06:18:51 crc kubenswrapper[4841]: E0130 06:18:51.579048 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="registry-server" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.579061 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="registry-server" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.579337 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9433f72b-3889-48f5-b7d1-5c5f4d14bc76" containerName="registry-server" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.584233 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.588274 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-swf2b"] Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.708624 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-catalog-content\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.708805 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-utilities\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.708995 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q554\" (UniqueName: \"kubernetes.io/projected/398414c1-4a02-4e87-9cc3-2a47b985c4d5-kube-api-access-2q554\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.810665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-catalog-content\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.810736 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-utilities\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.810788 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q554\" (UniqueName: \"kubernetes.io/projected/398414c1-4a02-4e87-9cc3-2a47b985c4d5-kube-api-access-2q554\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.811337 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-catalog-content\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.811578 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-utilities\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.845481 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q554\" (UniqueName: \"kubernetes.io/projected/398414c1-4a02-4e87-9cc3-2a47b985c4d5-kube-api-access-2q554\") pod \"redhat-operators-swf2b\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:51 crc kubenswrapper[4841]: I0130 06:18:51.916811 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:18:52 crc kubenswrapper[4841]: I0130 06:18:52.157072 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-swf2b"] Jan 30 06:18:52 crc kubenswrapper[4841]: I0130 06:18:52.406348 4841 generic.go:334] "Generic (PLEG): container finished" podID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerID="bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc" exitCode=0 Jan 30 06:18:52 crc kubenswrapper[4841]: I0130 06:18:52.406385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-swf2b" event={"ID":"398414c1-4a02-4e87-9cc3-2a47b985c4d5","Type":"ContainerDied","Data":"bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc"} Jan 30 06:18:52 crc kubenswrapper[4841]: I0130 06:18:52.406428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-swf2b" event={"ID":"398414c1-4a02-4e87-9cc3-2a47b985c4d5","Type":"ContainerStarted","Data":"f6985dc15f010733fc8d9aa0a9b9987cdf1435565d677c94081329cf1e172b5c"} Jan 30 06:18:53 crc kubenswrapper[4841]: I0130 06:18:53.417099 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-swf2b" event={"ID":"398414c1-4a02-4e87-9cc3-2a47b985c4d5","Type":"ContainerStarted","Data":"d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127"} Jan 30 06:18:54 crc kubenswrapper[4841]: I0130 06:18:54.429563 4841 generic.go:334] "Generic (PLEG): container finished" podID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerID="d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127" exitCode=0 Jan 30 06:18:54 crc kubenswrapper[4841]: I0130 06:18:54.429682 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-swf2b" 
event={"ID":"398414c1-4a02-4e87-9cc3-2a47b985c4d5","Type":"ContainerDied","Data":"d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127"} Jan 30 06:18:55 crc kubenswrapper[4841]: I0130 06:18:55.441951 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-swf2b" event={"ID":"398414c1-4a02-4e87-9cc3-2a47b985c4d5","Type":"ContainerStarted","Data":"ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197"} Jan 30 06:18:55 crc kubenswrapper[4841]: I0130 06:18:55.475655 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-swf2b" podStartSLOduration=2.001446316 podStartE2EDuration="4.475626405s" podCreationTimestamp="2026-01-30 06:18:51 +0000 UTC" firstStartedPulling="2026-01-30 06:18:52.408590414 +0000 UTC m=+4269.402063062" lastFinishedPulling="2026-01-30 06:18:54.882770473 +0000 UTC m=+4271.876243151" observedRunningTime="2026-01-30 06:18:55.468791496 +0000 UTC m=+4272.462264174" watchObservedRunningTime="2026-01-30 06:18:55.475626405 +0000 UTC m=+4272.469099083" Jan 30 06:18:56 crc kubenswrapper[4841]: I0130 06:18:56.433443 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:18:56 crc kubenswrapper[4841]: E0130 06:18:56.433904 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:19:01 crc kubenswrapper[4841]: I0130 06:19:01.917691 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:19:01 crc kubenswrapper[4841]: 
I0130 06:19:01.918197 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:19:02 crc kubenswrapper[4841]: I0130 06:19:02.999044 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-swf2b" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="registry-server" probeResult="failure" output=< Jan 30 06:19:02 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Jan 30 06:19:02 crc kubenswrapper[4841]: > Jan 30 06:19:11 crc kubenswrapper[4841]: I0130 06:19:11.432533 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:19:11 crc kubenswrapper[4841]: E0130 06:19:11.433755 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:19:11 crc kubenswrapper[4841]: I0130 06:19:11.994543 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:19:12 crc kubenswrapper[4841]: I0130 06:19:12.068886 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:19:12 crc kubenswrapper[4841]: I0130 06:19:12.245960 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-swf2b"] Jan 30 06:19:13 crc kubenswrapper[4841]: I0130 06:19:13.594784 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-swf2b" 
podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="registry-server" containerID="cri-o://ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197" gracePeriod=2 Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.394787 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.568106 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q554\" (UniqueName: \"kubernetes.io/projected/398414c1-4a02-4e87-9cc3-2a47b985c4d5-kube-api-access-2q554\") pod \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.568186 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-utilities\") pod \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.568327 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-catalog-content\") pod \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\" (UID: \"398414c1-4a02-4e87-9cc3-2a47b985c4d5\") " Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.574991 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-utilities" (OuterVolumeSpecName: "utilities") pod "398414c1-4a02-4e87-9cc3-2a47b985c4d5" (UID: "398414c1-4a02-4e87-9cc3-2a47b985c4d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.585643 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398414c1-4a02-4e87-9cc3-2a47b985c4d5-kube-api-access-2q554" (OuterVolumeSpecName: "kube-api-access-2q554") pod "398414c1-4a02-4e87-9cc3-2a47b985c4d5" (UID: "398414c1-4a02-4e87-9cc3-2a47b985c4d5"). InnerVolumeSpecName "kube-api-access-2q554". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.593849 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q554\" (UniqueName: \"kubernetes.io/projected/398414c1-4a02-4e87-9cc3-2a47b985c4d5-kube-api-access-2q554\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.600633 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.626078 4841 generic.go:334] "Generic (PLEG): container finished" podID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerID="ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197" exitCode=0 Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.626122 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-swf2b" event={"ID":"398414c1-4a02-4e87-9cc3-2a47b985c4d5","Type":"ContainerDied","Data":"ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197"} Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.626152 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-swf2b" event={"ID":"398414c1-4a02-4e87-9cc3-2a47b985c4d5","Type":"ContainerDied","Data":"f6985dc15f010733fc8d9aa0a9b9987cdf1435565d677c94081329cf1e172b5c"} Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 
06:19:14.626167 4841 scope.go:117] "RemoveContainer" containerID="ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.626268 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-swf2b" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.654927 4841 scope.go:117] "RemoveContainer" containerID="d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.680790 4841 scope.go:117] "RemoveContainer" containerID="bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.710887 4841 scope.go:117] "RemoveContainer" containerID="ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197" Jan 30 06:19:14 crc kubenswrapper[4841]: E0130 06:19:14.714707 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197\": container with ID starting with ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197 not found: ID does not exist" containerID="ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.714758 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197"} err="failed to get container status \"ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197\": rpc error: code = NotFound desc = could not find container \"ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197\": container with ID starting with ef5836e5c8d993eb844b413fa32251f54ea847e2dd40ef81a921ccd1f8125197 not found: ID does not exist" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.714791 4841 
scope.go:117] "RemoveContainer" containerID="d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127" Jan 30 06:19:14 crc kubenswrapper[4841]: E0130 06:19:14.720547 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127\": container with ID starting with d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127 not found: ID does not exist" containerID="d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.720698 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127"} err="failed to get container status \"d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127\": rpc error: code = NotFound desc = could not find container \"d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127\": container with ID starting with d7f2355f599aefcc7465f130295730e50e218d9efdeab772d884754593213127 not found: ID does not exist" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.720811 4841 scope.go:117] "RemoveContainer" containerID="bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc" Jan 30 06:19:14 crc kubenswrapper[4841]: E0130 06:19:14.721954 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc\": container with ID starting with bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc not found: ID does not exist" containerID="bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.722032 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc"} err="failed to get container status \"bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc\": rpc error: code = NotFound desc = could not find container \"bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc\": container with ID starting with bc8ecd2f1cd74194f32750b00bf386a87809f4c300e8e486eb851a805f913cbc not found: ID does not exist" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.742777 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "398414c1-4a02-4e87-9cc3-2a47b985c4d5" (UID: "398414c1-4a02-4e87-9cc3-2a47b985c4d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.804450 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398414c1-4a02-4e87-9cc3-2a47b985c4d5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.968673 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-swf2b"] Jan 30 06:19:14 crc kubenswrapper[4841]: I0130 06:19:14.979083 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-swf2b"] Jan 30 06:19:16 crc kubenswrapper[4841]: I0130 06:19:16.440057 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" path="/var/lib/kubelet/pods/398414c1-4a02-4e87-9cc3-2a47b985c4d5/volumes" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.473036 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-66mtv"] Jan 30 06:19:20 crc kubenswrapper[4841]: E0130 
06:19:20.476136 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="extract-content" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.476172 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="extract-content" Jan 30 06:19:20 crc kubenswrapper[4841]: E0130 06:19:20.476210 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="registry-server" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.476223 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="registry-server" Jan 30 06:19:20 crc kubenswrapper[4841]: E0130 06:19:20.476258 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="extract-utilities" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.476273 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="extract-utilities" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.476838 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="398414c1-4a02-4e87-9cc3-2a47b985c4d5" containerName="registry-server" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.481100 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.518715 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66mtv"] Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.587988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-utilities\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.588224 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-catalog-content\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.588453 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bgn\" (UniqueName: \"kubernetes.io/projected/4c995dfe-8b43-43a9-b882-7b2549f32172-kube-api-access-h8bgn\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.690322 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-utilities\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.690470 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-catalog-content\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.690597 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bgn\" (UniqueName: \"kubernetes.io/projected/4c995dfe-8b43-43a9-b882-7b2549f32172-kube-api-access-h8bgn\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.690974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-utilities\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.691239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-catalog-content\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.710439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8bgn\" (UniqueName: \"kubernetes.io/projected/4c995dfe-8b43-43a9-b882-7b2549f32172-kube-api-access-h8bgn\") pod \"certified-operators-66mtv\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:20 crc kubenswrapper[4841]: I0130 06:19:20.825229 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:21 crc kubenswrapper[4841]: I0130 06:19:21.058041 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66mtv"] Jan 30 06:19:21 crc kubenswrapper[4841]: I0130 06:19:21.687572 4841 generic.go:334] "Generic (PLEG): container finished" podID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerID="4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4" exitCode=0 Jan 30 06:19:21 crc kubenswrapper[4841]: I0130 06:19:21.687624 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66mtv" event={"ID":"4c995dfe-8b43-43a9-b882-7b2549f32172","Type":"ContainerDied","Data":"4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4"} Jan 30 06:19:21 crc kubenswrapper[4841]: I0130 06:19:21.687940 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66mtv" event={"ID":"4c995dfe-8b43-43a9-b882-7b2549f32172","Type":"ContainerStarted","Data":"e56b45afafa607a95ae240238d9abfcde94b5e3f214e1da366fcefa51aed64a7"} Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.270801 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-znwlp"] Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.272533 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.304709 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-znwlp"] Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.312809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7be-b680-4d67-ace5-892c478656fa-catalog-content\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.312866 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnzdh\" (UniqueName: \"kubernetes.io/projected/8bbac7be-b680-4d67-ace5-892c478656fa-kube-api-access-jnzdh\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.312971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7be-b680-4d67-ace5-892c478656fa-utilities\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.414353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7be-b680-4d67-ace5-892c478656fa-catalog-content\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.414609 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jnzdh\" (UniqueName: \"kubernetes.io/projected/8bbac7be-b680-4d67-ace5-892c478656fa-kube-api-access-jnzdh\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.414634 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7be-b680-4d67-ace5-892c478656fa-utilities\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.414821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7be-b680-4d67-ace5-892c478656fa-catalog-content\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.415003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7be-b680-4d67-ace5-892c478656fa-utilities\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.441552 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnzdh\" (UniqueName: \"kubernetes.io/projected/8bbac7be-b680-4d67-ace5-892c478656fa-kube-api-access-jnzdh\") pod \"community-operators-znwlp\" (UID: \"8bbac7be-b680-4d67-ace5-892c478656fa\") " pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.605709 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:22 crc kubenswrapper[4841]: I0130 06:19:22.694856 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66mtv" event={"ID":"4c995dfe-8b43-43a9-b882-7b2549f32172","Type":"ContainerStarted","Data":"46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6"} Jan 30 06:19:23 crc kubenswrapper[4841]: I0130 06:19:23.083139 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-znwlp"] Jan 30 06:19:23 crc kubenswrapper[4841]: I0130 06:19:23.710379 4841 generic.go:334] "Generic (PLEG): container finished" podID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerID="46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6" exitCode=0 Jan 30 06:19:23 crc kubenswrapper[4841]: I0130 06:19:23.710493 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66mtv" event={"ID":"4c995dfe-8b43-43a9-b882-7b2549f32172","Type":"ContainerDied","Data":"46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6"} Jan 30 06:19:23 crc kubenswrapper[4841]: I0130 06:19:23.718706 4841 generic.go:334] "Generic (PLEG): container finished" podID="8bbac7be-b680-4d67-ace5-892c478656fa" containerID="6fbe1052f62498cd8642888de886366a26d453726ab777af0a5a67119e0c065f" exitCode=0 Jan 30 06:19:23 crc kubenswrapper[4841]: I0130 06:19:23.718759 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znwlp" event={"ID":"8bbac7be-b680-4d67-ace5-892c478656fa","Type":"ContainerDied","Data":"6fbe1052f62498cd8642888de886366a26d453726ab777af0a5a67119e0c065f"} Jan 30 06:19:23 crc kubenswrapper[4841]: I0130 06:19:23.718798 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znwlp" 
event={"ID":"8bbac7be-b680-4d67-ace5-892c478656fa","Type":"ContainerStarted","Data":"b8b3ed51d1fa51c57d280762860fa0b5b5c198bff109f210ea3f14ac2178b920"} Jan 30 06:19:24 crc kubenswrapper[4841]: I0130 06:19:24.727340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66mtv" event={"ID":"4c995dfe-8b43-43a9-b882-7b2549f32172","Type":"ContainerStarted","Data":"ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f"} Jan 30 06:19:25 crc kubenswrapper[4841]: I0130 06:19:25.431873 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:19:25 crc kubenswrapper[4841]: E0130 06:19:25.432245 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:19:27 crc kubenswrapper[4841]: I0130 06:19:27.762651 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znwlp" event={"ID":"8bbac7be-b680-4d67-ace5-892c478656fa","Type":"ContainerStarted","Data":"17824ccdc804f1ecd194f47685f47e2929e67c16e8cc98012589b6f095553de3"} Jan 30 06:19:27 crc kubenswrapper[4841]: I0130 06:19:27.780525 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-66mtv" podStartSLOduration=5.368287164 podStartE2EDuration="7.78051403s" podCreationTimestamp="2026-01-30 06:19:20 +0000 UTC" firstStartedPulling="2026-01-30 06:19:21.690106293 +0000 UTC m=+4298.683578961" lastFinishedPulling="2026-01-30 06:19:24.102333149 +0000 UTC m=+4301.095805827" observedRunningTime="2026-01-30 06:19:24.759897106 +0000 
UTC m=+4301.753369744" watchObservedRunningTime="2026-01-30 06:19:27.78051403 +0000 UTC m=+4304.773986668" Jan 30 06:19:28 crc kubenswrapper[4841]: I0130 06:19:28.773490 4841 generic.go:334] "Generic (PLEG): container finished" podID="8bbac7be-b680-4d67-ace5-892c478656fa" containerID="17824ccdc804f1ecd194f47685f47e2929e67c16e8cc98012589b6f095553de3" exitCode=0 Jan 30 06:19:28 crc kubenswrapper[4841]: I0130 06:19:28.773556 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znwlp" event={"ID":"8bbac7be-b680-4d67-ace5-892c478656fa","Type":"ContainerDied","Data":"17824ccdc804f1ecd194f47685f47e2929e67c16e8cc98012589b6f095553de3"} Jan 30 06:19:29 crc kubenswrapper[4841]: I0130 06:19:29.786253 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-znwlp" event={"ID":"8bbac7be-b680-4d67-ace5-892c478656fa","Type":"ContainerStarted","Data":"155b05eb254f48c2f10485a52e7e2bf94211a341ce668186c1fbc72cbeb1a48d"} Jan 30 06:19:29 crc kubenswrapper[4841]: I0130 06:19:29.829322 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-znwlp" podStartSLOduration=2.329699826 podStartE2EDuration="7.829292655s" podCreationTimestamp="2026-01-30 06:19:22 +0000 UTC" firstStartedPulling="2026-01-30 06:19:23.721326507 +0000 UTC m=+4300.714799155" lastFinishedPulling="2026-01-30 06:19:29.220919346 +0000 UTC m=+4306.214391984" observedRunningTime="2026-01-30 06:19:29.82264229 +0000 UTC m=+4306.816114938" watchObservedRunningTime="2026-01-30 06:19:29.829292655 +0000 UTC m=+4306.822765333" Jan 30 06:19:30 crc kubenswrapper[4841]: I0130 06:19:30.825784 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:30 crc kubenswrapper[4841]: I0130 06:19:30.825877 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:30 crc kubenswrapper[4841]: I0130 06:19:30.904961 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:31 crc kubenswrapper[4841]: I0130 06:19:31.851153 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:32 crc kubenswrapper[4841]: I0130 06:19:32.054010 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66mtv"] Jan 30 06:19:32 crc kubenswrapper[4841]: I0130 06:19:32.606662 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:32 crc kubenswrapper[4841]: I0130 06:19:32.606712 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:32 crc kubenswrapper[4841]: I0130 06:19:32.680353 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:33 crc kubenswrapper[4841]: I0130 06:19:33.831387 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-66mtv" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="registry-server" containerID="cri-o://ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f" gracePeriod=2 Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.456106 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.605994 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-utilities\") pod \"4c995dfe-8b43-43a9-b882-7b2549f32172\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.606113 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-catalog-content\") pod \"4c995dfe-8b43-43a9-b882-7b2549f32172\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.606173 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8bgn\" (UniqueName: \"kubernetes.io/projected/4c995dfe-8b43-43a9-b882-7b2549f32172-kube-api-access-h8bgn\") pod \"4c995dfe-8b43-43a9-b882-7b2549f32172\" (UID: \"4c995dfe-8b43-43a9-b882-7b2549f32172\") " Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.607604 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-utilities" (OuterVolumeSpecName: "utilities") pod "4c995dfe-8b43-43a9-b882-7b2549f32172" (UID: "4c995dfe-8b43-43a9-b882-7b2549f32172"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.609007 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.614921 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c995dfe-8b43-43a9-b882-7b2549f32172-kube-api-access-h8bgn" (OuterVolumeSpecName: "kube-api-access-h8bgn") pod "4c995dfe-8b43-43a9-b882-7b2549f32172" (UID: "4c995dfe-8b43-43a9-b882-7b2549f32172"). InnerVolumeSpecName "kube-api-access-h8bgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.677661 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c995dfe-8b43-43a9-b882-7b2549f32172" (UID: "4c995dfe-8b43-43a9-b882-7b2549f32172"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.710196 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c995dfe-8b43-43a9-b882-7b2549f32172-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.710245 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8bgn\" (UniqueName: \"kubernetes.io/projected/4c995dfe-8b43-43a9-b882-7b2549f32172-kube-api-access-h8bgn\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.844080 4841 generic.go:334] "Generic (PLEG): container finished" podID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerID="ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f" exitCode=0 Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.844140 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66mtv" event={"ID":"4c995dfe-8b43-43a9-b882-7b2549f32172","Type":"ContainerDied","Data":"ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f"} Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.844178 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66mtv" event={"ID":"4c995dfe-8b43-43a9-b882-7b2549f32172","Type":"ContainerDied","Data":"e56b45afafa607a95ae240238d9abfcde94b5e3f214e1da366fcefa51aed64a7"} Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.844207 4841 scope.go:117] "RemoveContainer" containerID="ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.844393 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66mtv" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.876939 4841 scope.go:117] "RemoveContainer" containerID="46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.898165 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66mtv"] Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.909719 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-66mtv"] Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.925535 4841 scope.go:117] "RemoveContainer" containerID="4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.953625 4841 scope.go:117] "RemoveContainer" containerID="ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f" Jan 30 06:19:34 crc kubenswrapper[4841]: E0130 06:19:34.954342 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f\": container with ID starting with ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f not found: ID does not exist" containerID="ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.954390 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f"} err="failed to get container status \"ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f\": rpc error: code = NotFound desc = could not find container \"ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f\": container with ID starting with ea0fb37477c54db6f4d9f84717657b8ec1f9288231d8514e73f9118de7ba4b3f not 
found: ID does not exist" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.954486 4841 scope.go:117] "RemoveContainer" containerID="46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6" Jan 30 06:19:34 crc kubenswrapper[4841]: E0130 06:19:34.955507 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6\": container with ID starting with 46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6 not found: ID does not exist" containerID="46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.955589 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6"} err="failed to get container status \"46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6\": rpc error: code = NotFound desc = could not find container \"46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6\": container with ID starting with 46dff822bd8fddc5d61566bff993bd1ad5dd64583bacd446f1a98b8824ca5ea6 not found: ID does not exist" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.955645 4841 scope.go:117] "RemoveContainer" containerID="4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4" Jan 30 06:19:34 crc kubenswrapper[4841]: E0130 06:19:34.956112 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4\": container with ID starting with 4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4 not found: ID does not exist" containerID="4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4" Jan 30 06:19:34 crc kubenswrapper[4841]: I0130 06:19:34.956303 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4"} err="failed to get container status \"4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4\": rpc error: code = NotFound desc = could not find container \"4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4\": container with ID starting with 4f21c0b20f0a3cbe5db63e8e3a673b5310d1128c78abcfe197e022389f07a0c4 not found: ID does not exist" Jan 30 06:19:36 crc kubenswrapper[4841]: I0130 06:19:36.448353 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" path="/var/lib/kubelet/pods/4c995dfe-8b43-43a9-b882-7b2549f32172/volumes" Jan 30 06:19:39 crc kubenswrapper[4841]: I0130 06:19:39.431891 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:19:39 crc kubenswrapper[4841]: E0130 06:19:39.432610 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:19:42 crc kubenswrapper[4841]: I0130 06:19:42.671108 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-znwlp" Jan 30 06:19:42 crc kubenswrapper[4841]: I0130 06:19:42.748323 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-znwlp"] Jan 30 06:19:42 crc kubenswrapper[4841]: I0130 06:19:42.840967 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q52v7"] Jan 30 06:19:42 
crc kubenswrapper[4841]: I0130 06:19:42.841638 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q52v7" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="registry-server" containerID="cri-o://2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67" gracePeriod=2 Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.282150 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q52v7" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.454349 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-catalog-content\") pod \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.454540 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-utilities\") pod \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.454581 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqs2d\" (UniqueName: \"kubernetes.io/projected/0bb7f95e-112b-4fe7-89eb-398eea0d0472-kube-api-access-wqs2d\") pod \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\" (UID: \"0bb7f95e-112b-4fe7-89eb-398eea0d0472\") " Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.455051 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-utilities" (OuterVolumeSpecName: "utilities") pod "0bb7f95e-112b-4fe7-89eb-398eea0d0472" (UID: "0bb7f95e-112b-4fe7-89eb-398eea0d0472"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.465577 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb7f95e-112b-4fe7-89eb-398eea0d0472-kube-api-access-wqs2d" (OuterVolumeSpecName: "kube-api-access-wqs2d") pod "0bb7f95e-112b-4fe7-89eb-398eea0d0472" (UID: "0bb7f95e-112b-4fe7-89eb-398eea0d0472"). InnerVolumeSpecName "kube-api-access-wqs2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.502801 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bb7f95e-112b-4fe7-89eb-398eea0d0472" (UID: "0bb7f95e-112b-4fe7-89eb-398eea0d0472"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.556036 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.556062 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqs2d\" (UniqueName: \"kubernetes.io/projected/0bb7f95e-112b-4fe7-89eb-398eea0d0472-kube-api-access-wqs2d\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.556073 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bb7f95e-112b-4fe7-89eb-398eea0d0472-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.949898 4841 generic.go:334] "Generic (PLEG): container finished" podID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" 
containerID="2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67" exitCode=0 Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.949936 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q52v7" event={"ID":"0bb7f95e-112b-4fe7-89eb-398eea0d0472","Type":"ContainerDied","Data":"2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67"} Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.949961 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q52v7" event={"ID":"0bb7f95e-112b-4fe7-89eb-398eea0d0472","Type":"ContainerDied","Data":"42e3f78653573f945163d6a7de5c5140e1f008cee1e99223579aaca8b9d77c46"} Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.949978 4841 scope.go:117] "RemoveContainer" containerID="2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.949991 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q52v7" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.968303 4841 scope.go:117] "RemoveContainer" containerID="9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.982569 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q52v7"] Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.990163 4841 scope.go:117] "RemoveContainer" containerID="db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e" Jan 30 06:19:43 crc kubenswrapper[4841]: I0130 06:19:43.990623 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q52v7"] Jan 30 06:19:44 crc kubenswrapper[4841]: I0130 06:19:44.016768 4841 scope.go:117] "RemoveContainer" containerID="2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67" Jan 30 06:19:44 crc kubenswrapper[4841]: E0130 06:19:44.017054 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67\": container with ID starting with 2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67 not found: ID does not exist" containerID="2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67" Jan 30 06:19:44 crc kubenswrapper[4841]: I0130 06:19:44.017100 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67"} err="failed to get container status \"2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67\": rpc error: code = NotFound desc = could not find container \"2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67\": container with ID starting with 2c72f4063a4f4855bd6bc1cf5f048fe0d3ad64acfd7da157400fae64fef0cd67 not 
found: ID does not exist" Jan 30 06:19:44 crc kubenswrapper[4841]: I0130 06:19:44.017125 4841 scope.go:117] "RemoveContainer" containerID="9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3" Jan 30 06:19:44 crc kubenswrapper[4841]: E0130 06:19:44.017324 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3\": container with ID starting with 9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3 not found: ID does not exist" containerID="9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3" Jan 30 06:19:44 crc kubenswrapper[4841]: I0130 06:19:44.017344 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3"} err="failed to get container status \"9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3\": rpc error: code = NotFound desc = could not find container \"9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3\": container with ID starting with 9362a199c72b87cec3b2d1c31aeabfd87693215e2170d87296de65ee3ebe8cc3 not found: ID does not exist" Jan 30 06:19:44 crc kubenswrapper[4841]: I0130 06:19:44.017358 4841 scope.go:117] "RemoveContainer" containerID="db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e" Jan 30 06:19:44 crc kubenswrapper[4841]: E0130 06:19:44.017600 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e\": container with ID starting with db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e not found: ID does not exist" containerID="db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e" Jan 30 06:19:44 crc kubenswrapper[4841]: I0130 06:19:44.017657 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e"} err="failed to get container status \"db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e\": rpc error: code = NotFound desc = could not find container \"db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e\": container with ID starting with db1e7cb6a870303d33a9e8f4ae67db7fafee5ae7f5ba69ff56b910eaf9e3c55e not found: ID does not exist" Jan 30 06:19:44 crc kubenswrapper[4841]: I0130 06:19:44.449552 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" path="/var/lib/kubelet/pods/0bb7f95e-112b-4fe7-89eb-398eea0d0472/volumes" Jan 30 06:19:53 crc kubenswrapper[4841]: I0130 06:19:53.432188 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:19:53 crc kubenswrapper[4841]: E0130 06:19:53.433183 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:20:08 crc kubenswrapper[4841]: I0130 06:20:08.433034 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:20:08 crc kubenswrapper[4841]: E0130 06:20:08.434164 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:20:21 crc kubenswrapper[4841]: I0130 06:20:21.432383 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:20:21 crc kubenswrapper[4841]: E0130 06:20:21.433325 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:20:36 crc kubenswrapper[4841]: I0130 06:20:36.432437 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:20:36 crc kubenswrapper[4841]: E0130 06:20:36.433520 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:20:51 crc kubenswrapper[4841]: I0130 06:20:51.432187 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:20:51 crc kubenswrapper[4841]: E0130 06:20:51.433321 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:21:04 crc kubenswrapper[4841]: I0130 06:21:04.439554 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:21:04 crc kubenswrapper[4841]: E0130 06:21:04.440528 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:21:18 crc kubenswrapper[4841]: I0130 06:21:18.432131 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:21:18 crc kubenswrapper[4841]: E0130 06:21:18.433394 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:21:33 crc kubenswrapper[4841]: I0130 06:21:33.432674 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:21:33 crc kubenswrapper[4841]: E0130 06:21:33.433847 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:21:47 crc kubenswrapper[4841]: I0130 06:21:47.432929 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:21:47 crc kubenswrapper[4841]: E0130 06:21:47.433939 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:21:59 crc kubenswrapper[4841]: I0130 06:21:59.432099 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:21:59 crc kubenswrapper[4841]: E0130 06:21:59.433238 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:22:14 crc kubenswrapper[4841]: I0130 06:22:14.438060 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:22:14 crc kubenswrapper[4841]: E0130 06:22:14.440632 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:22:28 crc kubenswrapper[4841]: I0130 06:22:28.432602 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:22:28 crc kubenswrapper[4841]: E0130 06:22:28.433453 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:22:42 crc kubenswrapper[4841]: I0130 06:22:42.432234 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:22:42 crc kubenswrapper[4841]: E0130 06:22:42.433093 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:22:53 crc kubenswrapper[4841]: I0130 06:22:53.431722 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:22:53 crc kubenswrapper[4841]: E0130 06:22:53.432666 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:23:08 crc kubenswrapper[4841]: I0130 06:23:08.431639 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:23:08 crc kubenswrapper[4841]: E0130 06:23:08.432817 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:23:22 crc kubenswrapper[4841]: I0130 06:23:22.432038 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:23:22 crc kubenswrapper[4841]: E0130 06:23:22.433161 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:23:35 crc kubenswrapper[4841]: I0130 06:23:35.431999 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:23:35 crc kubenswrapper[4841]: E0130 06:23:35.433032 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:23:50 crc kubenswrapper[4841]: I0130 06:23:50.432145 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842" Jan 30 06:23:51 crc kubenswrapper[4841]: I0130 06:23:51.331890 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"420c2d89248701f2cd0f1e18fb3f4abc5f28d723a2143270388bf82acff15563"} Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.237988 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-j6z2x"] Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.247768 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-j6z2x"] Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397011 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-67kfs"] Jan 30 06:25:18 crc kubenswrapper[4841]: E0130 06:25:18.397308 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="extract-content" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397320 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="extract-content" Jan 30 06:25:18 crc kubenswrapper[4841]: E0130 06:25:18.397331 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="registry-server" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397337 4841 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="registry-server" Jan 30 06:25:18 crc kubenswrapper[4841]: E0130 06:25:18.397351 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="registry-server" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397357 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="registry-server" Jan 30 06:25:18 crc kubenswrapper[4841]: E0130 06:25:18.397370 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="extract-utilities" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397377 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="extract-utilities" Jan 30 06:25:18 crc kubenswrapper[4841]: E0130 06:25:18.397387 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="extract-content" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397393 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="extract-content" Jan 30 06:25:18 crc kubenswrapper[4841]: E0130 06:25:18.397435 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="extract-utilities" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397441 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="extract-utilities" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397552 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb7f95e-112b-4fe7-89eb-398eea0d0472" containerName="registry-server" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.397564 4841 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4c995dfe-8b43-43a9-b882-7b2549f32172" containerName="registry-server" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.398048 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.401037 4841 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pw6w2" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.402494 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.403298 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.405849 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.422883 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-crc-storage\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.423051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v85m\" (UniqueName: \"kubernetes.io/projected/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-kube-api-access-5v85m\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.423127 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-node-mnt\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.424132 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-67kfs"] Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.466072 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced3c1ba-5caa-433f-b303-3513827bbb37" path="/var/lib/kubelet/pods/ced3c1ba-5caa-433f-b303-3513827bbb37/volumes" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.524762 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v85m\" (UniqueName: \"kubernetes.io/projected/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-kube-api-access-5v85m\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.524854 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-node-mnt\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.524925 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-crc-storage\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.525549 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-node-mnt\") pod \"crc-storage-crc-67kfs\" 
(UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.526216 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-crc-storage\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.559381 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v85m\" (UniqueName: \"kubernetes.io/projected/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-kube-api-access-5v85m\") pod \"crc-storage-crc-67kfs\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") " pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:18 crc kubenswrapper[4841]: I0130 06:25:18.759684 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-67kfs" Jan 30 06:25:19 crc kubenswrapper[4841]: I0130 06:25:19.073756 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-67kfs"] Jan 30 06:25:19 crc kubenswrapper[4841]: I0130 06:25:19.079734 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:25:19 crc kubenswrapper[4841]: I0130 06:25:19.187316 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-67kfs" event={"ID":"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482","Type":"ContainerStarted","Data":"2faa501c6c2a93a0d69920e9f4817c0f718df75f5b1c3482581f84348cbac0f1"} Jan 30 06:25:20 crc kubenswrapper[4841]: I0130 06:25:20.196995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-67kfs" event={"ID":"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482","Type":"ContainerStarted","Data":"932f1a5c163df2bb79d66b8e0b351d3190a27f815cc4a1feb3d923ced9a08f84"} Jan 30 06:25:20 crc kubenswrapper[4841]: 
I0130 06:25:20.214787 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-67kfs" podStartSLOduration=1.429896821 podStartE2EDuration="2.214762515s" podCreationTimestamp="2026-01-30 06:25:18 +0000 UTC" firstStartedPulling="2026-01-30 06:25:19.079438492 +0000 UTC m=+4656.072911130" lastFinishedPulling="2026-01-30 06:25:19.864304146 +0000 UTC m=+4656.857776824" observedRunningTime="2026-01-30 06:25:20.21197026 +0000 UTC m=+4657.205442898" watchObservedRunningTime="2026-01-30 06:25:20.214762515 +0000 UTC m=+4657.208235163" Jan 30 06:25:21 crc kubenswrapper[4841]: I0130 06:25:21.228723 4841 generic.go:334] "Generic (PLEG): container finished" podID="fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" containerID="932f1a5c163df2bb79d66b8e0b351d3190a27f815cc4a1feb3d923ced9a08f84" exitCode=0 Jan 30 06:25:21 crc kubenswrapper[4841]: I0130 06:25:21.228868 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-67kfs" event={"ID":"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482","Type":"ContainerDied","Data":"932f1a5c163df2bb79d66b8e0b351d3190a27f815cc4a1feb3d923ced9a08f84"} Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.607234 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-67kfs"
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.694367 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-crc-storage\") pod \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") "
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.695086 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-node-mnt\") pod \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") "
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.695175 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" (UID: "fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.695187 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v85m\" (UniqueName: \"kubernetes.io/projected/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-kube-api-access-5v85m\") pod \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\" (UID: \"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482\") "
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.696181 4841 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-node-mnt\") on node \"crc\" DevicePath \"\""
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.716097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-kube-api-access-5v85m" (OuterVolumeSpecName: "kube-api-access-5v85m") pod "fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" (UID: "fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482"). InnerVolumeSpecName "kube-api-access-5v85m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.722108 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" (UID: "fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.797676 4841 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-crc-storage\") on node \"crc\" DevicePath \"\""
Jan 30 06:25:22 crc kubenswrapper[4841]: I0130 06:25:22.797734 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v85m\" (UniqueName: \"kubernetes.io/projected/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482-kube-api-access-5v85m\") on node \"crc\" DevicePath \"\""
Jan 30 06:25:23 crc kubenswrapper[4841]: I0130 06:25:23.261317 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-67kfs" event={"ID":"fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482","Type":"ContainerDied","Data":"2faa501c6c2a93a0d69920e9f4817c0f718df75f5b1c3482581f84348cbac0f1"}
Jan 30 06:25:23 crc kubenswrapper[4841]: I0130 06:25:23.261391 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2faa501c6c2a93a0d69920e9f4817c0f718df75f5b1c3482581f84348cbac0f1"
Jan 30 06:25:23 crc kubenswrapper[4841]: I0130 06:25:23.261535 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-67kfs"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.739631 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-67kfs"]
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.751767 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-67kfs"]
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.921893 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-knd4q"]
Jan 30 06:25:24 crc kubenswrapper[4841]: E0130 06:25:24.922950 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" containerName="storage"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.922992 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" containerName="storage"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.923389 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" containerName="storage"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.924444 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.929088 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.929152 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.929184 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.929343 4841 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pw6w2"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.935957 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-knd4q"]
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.936177 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/44a0f1db-953a-4765-aaa7-1e322049fa9e-node-mnt\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.936286 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/44a0f1db-953a-4765-aaa7-1e322049fa9e-crc-storage\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:24 crc kubenswrapper[4841]: I0130 06:25:24.936565 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2nnt\" (UniqueName: \"kubernetes.io/projected/44a0f1db-953a-4765-aaa7-1e322049fa9e-kube-api-access-f2nnt\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.038283 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/44a0f1db-953a-4765-aaa7-1e322049fa9e-node-mnt\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.038380 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/44a0f1db-953a-4765-aaa7-1e322049fa9e-crc-storage\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.038509 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2nnt\" (UniqueName: \"kubernetes.io/projected/44a0f1db-953a-4765-aaa7-1e322049fa9e-kube-api-access-f2nnt\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.038793 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/44a0f1db-953a-4765-aaa7-1e322049fa9e-node-mnt\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.039997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/44a0f1db-953a-4765-aaa7-1e322049fa9e-crc-storage\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.069042 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2nnt\" (UniqueName: \"kubernetes.io/projected/44a0f1db-953a-4765-aaa7-1e322049fa9e-kube-api-access-f2nnt\") pod \"crc-storage-crc-knd4q\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") " pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.250192 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:25 crc kubenswrapper[4841]: I0130 06:25:25.521930 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-knd4q"]
Jan 30 06:25:26 crc kubenswrapper[4841]: I0130 06:25:26.296213 4841 generic.go:334] "Generic (PLEG): container finished" podID="44a0f1db-953a-4765-aaa7-1e322049fa9e" containerID="fa0610d8235b445c2fe02fad7284d2d3b6bf96b221738d4f39c2123e956a2fa7" exitCode=0
Jan 30 06:25:26 crc kubenswrapper[4841]: I0130 06:25:26.296372 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-knd4q" event={"ID":"44a0f1db-953a-4765-aaa7-1e322049fa9e","Type":"ContainerDied","Data":"fa0610d8235b445c2fe02fad7284d2d3b6bf96b221738d4f39c2123e956a2fa7"}
Jan 30 06:25:26 crc kubenswrapper[4841]: I0130 06:25:26.296776 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-knd4q" event={"ID":"44a0f1db-953a-4765-aaa7-1e322049fa9e","Type":"ContainerStarted","Data":"f53bb13f74ba5eaea85190dea3bde4a71ca8f344d8ce8cd7b10637ca7a0964a6"}
Jan 30 06:25:26 crc kubenswrapper[4841]: I0130 06:25:26.443693 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482" path="/var/lib/kubelet/pods/fd1d20a9-0edd-4a46-9a3b-a4b3a3d03482/volumes"
Jan 30 06:25:27 crc kubenswrapper[4841]: I0130 06:25:27.724637 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:27 crc kubenswrapper[4841]: I0130 06:25:27.886354 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2nnt\" (UniqueName: \"kubernetes.io/projected/44a0f1db-953a-4765-aaa7-1e322049fa9e-kube-api-access-f2nnt\") pod \"44a0f1db-953a-4765-aaa7-1e322049fa9e\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") "
Jan 30 06:25:27 crc kubenswrapper[4841]: I0130 06:25:27.886554 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/44a0f1db-953a-4765-aaa7-1e322049fa9e-crc-storage\") pod \"44a0f1db-953a-4765-aaa7-1e322049fa9e\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") "
Jan 30 06:25:27 crc kubenswrapper[4841]: I0130 06:25:27.886661 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/44a0f1db-953a-4765-aaa7-1e322049fa9e-node-mnt\") pod \"44a0f1db-953a-4765-aaa7-1e322049fa9e\" (UID: \"44a0f1db-953a-4765-aaa7-1e322049fa9e\") "
Jan 30 06:25:27 crc kubenswrapper[4841]: I0130 06:25:27.886931 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a0f1db-953a-4765-aaa7-1e322049fa9e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "44a0f1db-953a-4765-aaa7-1e322049fa9e" (UID: "44a0f1db-953a-4765-aaa7-1e322049fa9e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:25:27 crc kubenswrapper[4841]: I0130 06:25:27.887451 4841 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/44a0f1db-953a-4765-aaa7-1e322049fa9e-node-mnt\") on node \"crc\" DevicePath \"\""
Jan 30 06:25:28 crc kubenswrapper[4841]: I0130 06:25:28.261972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a0f1db-953a-4765-aaa7-1e322049fa9e-kube-api-access-f2nnt" (OuterVolumeSpecName: "kube-api-access-f2nnt") pod "44a0f1db-953a-4765-aaa7-1e322049fa9e" (UID: "44a0f1db-953a-4765-aaa7-1e322049fa9e"). InnerVolumeSpecName "kube-api-access-f2nnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:25:28 crc kubenswrapper[4841]: I0130 06:25:28.285350 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a0f1db-953a-4765-aaa7-1e322049fa9e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "44a0f1db-953a-4765-aaa7-1e322049fa9e" (UID: "44a0f1db-953a-4765-aaa7-1e322049fa9e"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:25:28 crc kubenswrapper[4841]: I0130 06:25:28.293132 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2nnt\" (UniqueName: \"kubernetes.io/projected/44a0f1db-953a-4765-aaa7-1e322049fa9e-kube-api-access-f2nnt\") on node \"crc\" DevicePath \"\""
Jan 30 06:25:28 crc kubenswrapper[4841]: I0130 06:25:28.293172 4841 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/44a0f1db-953a-4765-aaa7-1e322049fa9e-crc-storage\") on node \"crc\" DevicePath \"\""
Jan 30 06:25:28 crc kubenswrapper[4841]: I0130 06:25:28.327682 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-knd4q" event={"ID":"44a0f1db-953a-4765-aaa7-1e322049fa9e","Type":"ContainerDied","Data":"f53bb13f74ba5eaea85190dea3bde4a71ca8f344d8ce8cd7b10637ca7a0964a6"}
Jan 30 06:25:28 crc kubenswrapper[4841]: I0130 06:25:28.327741 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f53bb13f74ba5eaea85190dea3bde4a71ca8f344d8ce8cd7b10637ca7a0964a6"
Jan 30 06:25:28 crc kubenswrapper[4841]: I0130 06:25:28.327758 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-knd4q"
Jan 30 06:25:50 crc kubenswrapper[4841]: I0130 06:25:50.794471 4841 scope.go:117] "RemoveContainer" containerID="08f903c700f19c8be6dcb8a9d2516ee34a4d54ef1ac4910e363afcd5175145d3"
Jan 30 06:26:10 crc kubenswrapper[4841]: I0130 06:26:10.463959 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:26:10 crc kubenswrapper[4841]: I0130 06:26:10.464936 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:26:40 crc kubenswrapper[4841]: I0130 06:26:40.463280 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:26:40 crc kubenswrapper[4841]: I0130 06:26:40.463939 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:27:10 crc kubenswrapper[4841]: I0130 06:27:10.464169 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:27:10 crc kubenswrapper[4841]: I0130 06:27:10.465599 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:27:10 crc kubenswrapper[4841]: I0130 06:27:10.465709 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2"
Jan 30 06:27:10 crc kubenswrapper[4841]: I0130 06:27:10.466481 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"420c2d89248701f2cd0f1e18fb3f4abc5f28d723a2143270388bf82acff15563"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 06:27:10 crc kubenswrapper[4841]: I0130 06:27:10.466589 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://420c2d89248701f2cd0f1e18fb3f4abc5f28d723a2143270388bf82acff15563" gracePeriod=600
Jan 30 06:27:11 crc kubenswrapper[4841]: I0130 06:27:11.360284 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="420c2d89248701f2cd0f1e18fb3f4abc5f28d723a2143270388bf82acff15563" exitCode=0
Jan 30 06:27:11 crc kubenswrapper[4841]: I0130 06:27:11.360355 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"420c2d89248701f2cd0f1e18fb3f4abc5f28d723a2143270388bf82acff15563"}
Jan 30 06:27:11 crc kubenswrapper[4841]: I0130 06:27:11.360715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176"}
Jan 30 06:27:11 crc kubenswrapper[4841]: I0130 06:27:11.360751 4841 scope.go:117] "RemoveContainer" containerID="dd6ce03c4a950be0ea9145b9c71723c813d327d8f14e269efb8c5d52277d9842"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.149455 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5986db9b4f-7f9vc"]
Jan 30 06:27:28 crc kubenswrapper[4841]: E0130 06:27:28.151109 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a0f1db-953a-4765-aaa7-1e322049fa9e" containerName="storage"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.151192 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a0f1db-953a-4765-aaa7-1e322049fa9e" containerName="storage"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.151385 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a0f1db-953a-4765-aaa7-1e322049fa9e" containerName="storage"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.152111 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.153977 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56bbd59dc5-jmd7c"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.154909 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.154975 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.155143 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.156533 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.157250 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.157699 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ts6fx"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.160591 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56bbd59dc5-jmd7c"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.177877 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5986db9b4f-7f9vc"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.241412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qv7\" (UniqueName: \"kubernetes.io/projected/ee3c0806-1f54-4280-8acc-280f1b360e10-kube-api-access-45qv7\") pod \"dnsmasq-dns-5986db9b4f-7f9vc\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.241530 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-config\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.241560 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp28z\" (UniqueName: \"kubernetes.io/projected/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-kube-api-access-zp28z\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.241591 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee3c0806-1f54-4280-8acc-280f1b360e10-config\") pod \"dnsmasq-dns-5986db9b4f-7f9vc\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.241609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-dns-svc\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.343282 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-config\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.343327 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp28z\" (UniqueName: \"kubernetes.io/projected/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-kube-api-access-zp28z\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.343363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee3c0806-1f54-4280-8acc-280f1b360e10-config\") pod \"dnsmasq-dns-5986db9b4f-7f9vc\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.343380 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-dns-svc\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.343433 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45qv7\" (UniqueName: \"kubernetes.io/projected/ee3c0806-1f54-4280-8acc-280f1b360e10-kube-api-access-45qv7\") pod \"dnsmasq-dns-5986db9b4f-7f9vc\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.344160 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-config\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.344354 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-dns-svc\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.344520 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee3c0806-1f54-4280-8acc-280f1b360e10-config\") pod \"dnsmasq-dns-5986db9b4f-7f9vc\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.365549 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qv7\" (UniqueName: \"kubernetes.io/projected/ee3c0806-1f54-4280-8acc-280f1b360e10-kube-api-access-45qv7\") pod \"dnsmasq-dns-5986db9b4f-7f9vc\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.370235 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp28z\" (UniqueName: \"kubernetes.io/projected/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-kube-api-access-zp28z\") pod \"dnsmasq-dns-56bbd59dc5-jmd7c\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.393702 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56bbd59dc5-jmd7c"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.394149 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.421036 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865d9b578f-r9qc4"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.422135 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.433320 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865d9b578f-r9qc4"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.470332 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.547426 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-config\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.547485 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-dns-svc\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.547509 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gll22\" (UniqueName: \"kubernetes.io/projected/238ad799-58fb-4f52-8873-055218e25807-kube-api-access-gll22\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.649755 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-config\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.649827 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-dns-svc\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.649856 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gll22\" (UniqueName: \"kubernetes.io/projected/238ad799-58fb-4f52-8873-055218e25807-kube-api-access-gll22\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.652932 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-config\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.654464 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-dns-svc\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.689499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gll22\" (UniqueName: \"kubernetes.io/projected/238ad799-58fb-4f52-8873-055218e25807-kube-api-access-gll22\") pod \"dnsmasq-dns-865d9b578f-r9qc4\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.702658 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5986db9b4f-7f9vc"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.715028 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-r27xj"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.717971 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.728856 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-r27xj"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.770920 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.854235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24b52\" (UniqueName: \"kubernetes.io/projected/17731592-890a-403a-86f6-61223bcd5320-kube-api-access-24b52\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.854276 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-config\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.854472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.956184 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24b52\" (UniqueName: \"kubernetes.io/projected/17731592-890a-403a-86f6-61223bcd5320-kube-api-access-24b52\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.956237 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-config\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.956294 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.957195 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-config\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.957347 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.968071 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56bbd59dc5-jmd7c"]
Jan 30 06:27:28 crc kubenswrapper[4841]: I0130 06:27:28.972628 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24b52\" (UniqueName: \"kubernetes.io/projected/17731592-890a-403a-86f6-61223bcd5320-kube-api-access-24b52\") pod \"dnsmasq-dns-5d79f765b5-r27xj\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:28 crc kubenswrapper[4841]: W0130 06:27:28.981781 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a4c4ea_eb9b_4fff_a2ef_fb7de34ca5fe.slice/crio-ccab9018a1f288e8802df32ffa158d7c53889724ab7510a44b5be17a36359cfa WatchSource:0}: Error finding container ccab9018a1f288e8802df32ffa158d7c53889724ab7510a44b5be17a36359cfa: Status 404 returned error can't find the container with id ccab9018a1f288e8802df32ffa158d7c53889724ab7510a44b5be17a36359cfa
Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.037238 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj"
Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.116537 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5986db9b4f-7f9vc"]
Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.294150 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865d9b578f-r9qc4"]
Jan 30 06:27:29 crc kubenswrapper[4841]: W0130 06:27:29.297461 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod238ad799_58fb_4f52_8873_055218e25807.slice/crio-63971e926a0650f04298faccb80504a79858aff98018b43f53c0a6bce8298197 WatchSource:0}: Error finding container 63971e926a0650f04298faccb80504a79858aff98018b43f53c0a6bce8298197: Status 404 returned error can't find the container with id 63971e926a0650f04298faccb80504a79858aff98018b43f53c0a6bce8298197
Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.484949 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-r27xj"]
Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.540355 4841 generic.go:334] "Generic (PLEG): container finished" podID="f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" containerID="d4afb9e26af85c934dae9d73300fe97d0b1191f494cdeaf6c8dbff0b2dda015b" exitCode=0
Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.540483 4841 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c" event={"ID":"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe","Type":"ContainerDied","Data":"d4afb9e26af85c934dae9d73300fe97d0b1191f494cdeaf6c8dbff0b2dda015b"} Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.540526 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c" event={"ID":"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe","Type":"ContainerStarted","Data":"ccab9018a1f288e8802df32ffa158d7c53889724ab7510a44b5be17a36359cfa"} Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.542484 4841 generic.go:334] "Generic (PLEG): container finished" podID="238ad799-58fb-4f52-8873-055218e25807" containerID="8b31fddee532dd3148666f46a730c8823bd9270b3770fd432bb7b4fcfc146d4e" exitCode=0 Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.542558 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" event={"ID":"238ad799-58fb-4f52-8873-055218e25807","Type":"ContainerDied","Data":"8b31fddee532dd3148666f46a730c8823bd9270b3770fd432bb7b4fcfc146d4e"} Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.542588 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" event={"ID":"238ad799-58fb-4f52-8873-055218e25807","Type":"ContainerStarted","Data":"63971e926a0650f04298faccb80504a79858aff98018b43f53c0a6bce8298197"} Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.545653 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" event={"ID":"17731592-890a-403a-86f6-61223bcd5320","Type":"ContainerStarted","Data":"b50eb118fbf4ffed03434a01e8639c548780a2e050e3faddfcf663b225838241"} Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.547031 4841 generic.go:334] "Generic (PLEG): container finished" podID="ee3c0806-1f54-4280-8acc-280f1b360e10" containerID="ebb004b906d4f9e41e44ae75a82b6d0b90cce1e16aa2033e6548d4c3324303de" exitCode=0 Jan 30 06:27:29 crc 
kubenswrapper[4841]: I0130 06:27:29.547064 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc" event={"ID":"ee3c0806-1f54-4280-8acc-280f1b360e10","Type":"ContainerDied","Data":"ebb004b906d4f9e41e44ae75a82b6d0b90cce1e16aa2033e6548d4c3324303de"} Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.547084 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc" event={"ID":"ee3c0806-1f54-4280-8acc-280f1b360e10","Type":"ContainerStarted","Data":"1c8c41eed6a639db1b29a53a022942c9ba2c5079e934b9bae3fd97b828784d4d"} Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.610640 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.611819 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.618930 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.619126 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.619243 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.619365 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.619699 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cwgq7" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.619740 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 
06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.619873 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.621266 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693131 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693411 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fb4c14d-b45a-48d9-8233-d69c8928f10a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693438 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc 
kubenswrapper[4841]: I0130 06:27:29.693503 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693532 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693593 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2w5v\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-kube-api-access-n2w5v\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693613 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693633 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fb4c14d-b45a-48d9-8233-d69c8928f10a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 
06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.693677 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797277 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2w5v\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-kube-api-access-n2w5v\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797329 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fb4c14d-b45a-48d9-8233-d69c8928f10a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797389 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797429 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797485 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797507 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fb4c14d-b45a-48d9-8233-d69c8928f10a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797533 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797553 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797587 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.797626 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.798090 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.802044 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.804696 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.807597 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fb4c14d-b45a-48d9-8233-d69c8928f10a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.807778 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.808185 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fb4c14d-b45a-48d9-8233-d69c8928f10a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.808482 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.809127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.809466 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.812309 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.812337 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/21cc17baf9198502c98b0abbf0c18d1a023cbe3555ccedbc346e9059d9fdd614/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.825313 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2w5v\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-kube-api-access-n2w5v\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.831728 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.839095 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.839211 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.842662 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.842938 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.843144 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.843232 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.843305 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.847516 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.847669 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x22vp" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.866916 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898677 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-config-data\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898695 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898733 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvk2x\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-kube-api-access-wvk2x\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db910dc4-3d79-4346-bce7-5ee16ef0576d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898765 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898784 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db910dc4-3d79-4346-bce7-5ee16ef0576d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898810 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898828 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898851 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.898868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.918231 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.923767 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc" Jan 30 06:27:29 crc kubenswrapper[4841]: I0130 06:27:29.963106 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.000080 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45qv7\" (UniqueName: \"kubernetes.io/projected/ee3c0806-1f54-4280-8acc-280f1b360e10-kube-api-access-45qv7\") pod \"ee3c0806-1f54-4280-8acc-280f1b360e10\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.000232 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-dns-svc\") pod \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.000437 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-config\") pod \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.000618 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ee3c0806-1f54-4280-8acc-280f1b360e10-config\") pod \"ee3c0806-1f54-4280-8acc-280f1b360e10\" (UID: \"ee3c0806-1f54-4280-8acc-280f1b360e10\") " Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.000748 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp28z\" (UniqueName: \"kubernetes.io/projected/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-kube-api-access-zp28z\") pod \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\" (UID: \"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe\") " Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001044 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001174 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-config-data\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001303 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001455 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvk2x\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-kube-api-access-wvk2x\") pod \"rabbitmq-server-0\" (UID: 
\"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001567 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db910dc4-3d79-4346-bce7-5ee16ef0576d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001670 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001764 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db910dc4-3d79-4346-bce7-5ee16ef0576d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001872 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.001968 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.002115 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.002227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.002997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-config-data\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.003284 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.003994 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.007115 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-kube-api-access-zp28z" (OuterVolumeSpecName: 
"kube-api-access-zp28z") pod "f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" (UID: "f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe"). InnerVolumeSpecName "kube-api-access-zp28z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.007291 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.007468 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.007840 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.007956 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7e36f4f0e8e3b9d51d5bc08cffff477e2bd49540e6100ea70e379b52ab5390c0/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.007895 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.009355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db910dc4-3d79-4346-bce7-5ee16ef0576d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.010005 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.010981 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3c0806-1f54-4280-8acc-280f1b360e10-kube-api-access-45qv7" (OuterVolumeSpecName: "kube-api-access-45qv7") pod "ee3c0806-1f54-4280-8acc-280f1b360e10" (UID: 
"ee3c0806-1f54-4280-8acc-280f1b360e10"). InnerVolumeSpecName "kube-api-access-45qv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.015507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db910dc4-3d79-4346-bce7-5ee16ef0576d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.019721 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvk2x\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-kube-api-access-wvk2x\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.024741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-config" (OuterVolumeSpecName: "config") pod "f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" (UID: "f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.027197 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" (UID: "f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.027273 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee3c0806-1f54-4280-8acc-280f1b360e10-config" (OuterVolumeSpecName: "config") pod "ee3c0806-1f54-4280-8acc-280f1b360e10" (UID: "ee3c0806-1f54-4280-8acc-280f1b360e10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.039252 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.103970 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45qv7\" (UniqueName: \"kubernetes.io/projected/ee3c0806-1f54-4280-8acc-280f1b360e10-kube-api-access-45qv7\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.104007 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.104019 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.104030 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee3c0806-1f54-4280-8acc-280f1b360e10-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.104042 4841 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp28z\" (UniqueName: \"kubernetes.io/projected/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe-kube-api-access-zp28z\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.199615 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:27:30 crc kubenswrapper[4841]: W0130 06:27:30.201728 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb4c14d_b45a_48d9_8233_d69c8928f10a.slice/crio-bfc9c909cf056d8c0027597c3254e35eb7fb2d657113bc9e38c4c961543b5a92 WatchSource:0}: Error finding container bfc9c909cf056d8c0027597c3254e35eb7fb2d657113bc9e38c4c961543b5a92: Status 404 returned error can't find the container with id bfc9c909cf056d8c0027597c3254e35eb7fb2d657113bc9e38c4c961543b5a92 Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.214807 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.449109 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.554685 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c" event={"ID":"f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe","Type":"ContainerDied","Data":"ccab9018a1f288e8802df32ffa158d7c53889724ab7510a44b5be17a36359cfa"} Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.554763 4841 scope.go:117] "RemoveContainer" containerID="d4afb9e26af85c934dae9d73300fe97d0b1191f494cdeaf6c8dbff0b2dda015b" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.554882 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bbd59dc5-jmd7c" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.558355 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" event={"ID":"238ad799-58fb-4f52-8873-055218e25807","Type":"ContainerStarted","Data":"d789deb7f5417992aa89e3a85761f1b1117a6985ff588d45222bc12c6f050cb8"} Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.558540 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.560291 4841 generic.go:334] "Generic (PLEG): container finished" podID="17731592-890a-403a-86f6-61223bcd5320" containerID="6f8d8fd789a3a8c5538159b9fa3cbdbb5f76436fde77899c3b06b5e6c78713f0" exitCode=0 Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.560358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" event={"ID":"17731592-890a-403a-86f6-61223bcd5320","Type":"ContainerDied","Data":"6f8d8fd789a3a8c5538159b9fa3cbdbb5f76436fde77899c3b06b5e6c78713f0"} Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.564716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc" event={"ID":"ee3c0806-1f54-4280-8acc-280f1b360e10","Type":"ContainerDied","Data":"1c8c41eed6a639db1b29a53a022942c9ba2c5079e934b9bae3fd97b828784d4d"} Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.564759 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5986db9b4f-7f9vc" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.568817 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db910dc4-3d79-4346-bce7-5ee16ef0576d","Type":"ContainerStarted","Data":"df9d5ad62a832444f4497f40f83814cbf1ecd3e1add32d084c77a5480c17989a"} Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.573321 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fb4c14d-b45a-48d9-8233-d69c8928f10a","Type":"ContainerStarted","Data":"bfc9c909cf056d8c0027597c3254e35eb7fb2d657113bc9e38c4c961543b5a92"} Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.596287 4841 scope.go:117] "RemoveContainer" containerID="ebb004b906d4f9e41e44ae75a82b6d0b90cce1e16aa2033e6548d4c3324303de" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.623846 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56bbd59dc5-jmd7c"] Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.653185 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56bbd59dc5-jmd7c"] Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.656880 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" podStartSLOduration=2.656863895 podStartE2EDuration="2.656863895s" podCreationTimestamp="2026-01-30 06:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:27:30.60857125 +0000 UTC m=+4787.602043898" watchObservedRunningTime="2026-01-30 06:27:30.656863895 +0000 UTC m=+4787.650336523" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.693320 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5986db9b4f-7f9vc"] Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.700629 4841 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5986db9b4f-7f9vc"] Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.871258 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 30 06:27:30 crc kubenswrapper[4841]: E0130 06:27:30.871744 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" containerName="init" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.871758 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" containerName="init" Jan 30 06:27:30 crc kubenswrapper[4841]: E0130 06:27:30.871771 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3c0806-1f54-4280-8acc-280f1b360e10" containerName="init" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.871776 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3c0806-1f54-4280-8acc-280f1b360e10" containerName="init" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.871903 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" containerName="init" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.871914 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3c0806-1f54-4280-8acc-280f1b360e10" containerName="init" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.872611 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.875856 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.876024 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.876140 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.876642 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-p525r" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.887555 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.894138 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919388 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fdf7cb-89ff-42f2-b405-f0423ab54224-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919452 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5fdf7cb-89ff-42f2-b405-f0423ab54224-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919519 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919638 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919710 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5fdf7cb-89ff-42f2-b405-f0423ab54224-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:30 crc kubenswrapper[4841]: I0130 06:27:30.919747 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-542xt\" (UniqueName: \"kubernetes.io/projected/f5fdf7cb-89ff-42f2-b405-f0423ab54224-kube-api-access-542xt\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.021689 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.021755 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fdf7cb-89ff-42f2-b405-f0423ab54224-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.021816 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.021857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5fdf7cb-89ff-42f2-b405-f0423ab54224-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.021896 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.021959 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.022041 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5fdf7cb-89ff-42f2-b405-f0423ab54224-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.022078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-542xt\" (UniqueName: \"kubernetes.io/projected/f5fdf7cb-89ff-42f2-b405-f0423ab54224-kube-api-access-542xt\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.022868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.023159 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.023239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5fdf7cb-89ff-42f2-b405-f0423ab54224-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.025472 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5fdf7cb-89ff-42f2-b405-f0423ab54224-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.025975 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.026040 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c5163da5b150b841183976341835b5050c456f071f1fb4a1f57689b13764a0de/globalmount\"" pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.027050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fdf7cb-89ff-42f2-b405-f0423ab54224-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.028336 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5fdf7cb-89ff-42f2-b405-f0423ab54224-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.054607 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-542xt\" (UniqueName: \"kubernetes.io/projected/f5fdf7cb-89ff-42f2-b405-f0423ab54224-kube-api-access-542xt\") pod \"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.059533 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7d181ba5-6359-4f4c-98a6-8fb76f55acc5\") pod 
\"openstack-galera-0\" (UID: \"f5fdf7cb-89ff-42f2-b405-f0423ab54224\") " pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.191129 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.585758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fb4c14d-b45a-48d9-8233-d69c8928f10a","Type":"ContainerStarted","Data":"1c65b7743403721ba54e28250bc6aafe2e1aecebcf4cfc247c63bbd4ad1e756b"} Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.601516 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" event={"ID":"17731592-890a-403a-86f6-61223bcd5320","Type":"ContainerStarted","Data":"b2d4a08ed12b126208cc5f04e62bd96329a7d5ab50ba0d85f08e1eb21bfe2519"} Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.602130 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.644183 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" podStartSLOduration=3.6430739880000003 podStartE2EDuration="3.643073988s" podCreationTimestamp="2026-01-30 06:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:27:31.641055364 +0000 UTC m=+4788.634528002" watchObservedRunningTime="2026-01-30 06:27:31.643073988 +0000 UTC m=+4788.636546626" Jan 30 06:27:31 crc kubenswrapper[4841]: W0130 06:27:31.682750 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5fdf7cb_89ff_42f2_b405_f0423ab54224.slice/crio-01b497a9e9ffe943b71101611c802ed8dab3370a3be49d805ac5465188acf566 WatchSource:0}: Error 
finding container 01b497a9e9ffe943b71101611c802ed8dab3370a3be49d805ac5465188acf566: Status 404 returned error can't find the container with id 01b497a9e9ffe943b71101611c802ed8dab3370a3be49d805ac5465188acf566 Jan 30 06:27:31 crc kubenswrapper[4841]: I0130 06:27:31.698048 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.369804 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.372108 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.376418 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.376475 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-f4bb5" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.377328 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.377620 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.453835 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3c0806-1f54-4280-8acc-280f1b360e10" path="/var/lib/kubelet/pods/ee3c0806-1f54-4280-8acc-280f1b360e10/volumes" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.454720 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe" path="/var/lib/kubelet/pods/f8a4c4ea-eb9b-4fff-a2ef-fb7de34ca5fe/volumes" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.458921 4841 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549605 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549667 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549688 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b274ae67-2cd8-40ad-a998-0483646d84a1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549706 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b274ae67-2cd8-40ad-a998-0483646d84a1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6sjq\" (UniqueName: 
\"kubernetes.io/projected/b274ae67-2cd8-40ad-a998-0483646d84a1-kube-api-access-s6sjq\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549778 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b274ae67-2cd8-40ad-a998-0483646d84a1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.549802 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.613890 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5fdf7cb-89ff-42f2-b405-f0423ab54224","Type":"ContainerStarted","Data":"3f05bfd4a65fca9db45f544c82cbdc48444605053922972fbb5011f41cc7e36d"} Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.613972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"f5fdf7cb-89ff-42f2-b405-f0423ab54224","Type":"ContainerStarted","Data":"01b497a9e9ffe943b71101611c802ed8dab3370a3be49d805ac5465188acf566"} Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.622495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db910dc4-3d79-4346-bce7-5ee16ef0576d","Type":"ContainerStarted","Data":"65fa24b5fc8dbd46d053c318fb8572c25ea465821895b136e8708788173b117b"} Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.653921 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.654035 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b274ae67-2cd8-40ad-a998-0483646d84a1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.654082 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b274ae67-2cd8-40ad-a998-0483646d84a1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.654117 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6sjq\" (UniqueName: \"kubernetes.io/projected/b274ae67-2cd8-40ad-a998-0483646d84a1-kube-api-access-s6sjq\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " 
pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.654158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.654231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b274ae67-2cd8-40ad-a998-0483646d84a1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.654279 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.654352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.656265 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 
30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.658041 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.659574 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b274ae67-2cd8-40ad-a998-0483646d84a1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.664105 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b274ae67-2cd8-40ad-a998-0483646d84a1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.664154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b274ae67-2cd8-40ad-a998-0483646d84a1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.668232 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.668288 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/82fe53e9510c5ab1b4de14bb766d9f2e24bbb667140133800250ea5a3b0d983d/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.670918 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.671893 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.675638 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.675978 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.676211 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vr5km" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.683952 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b274ae67-2cd8-40ad-a998-0483646d84a1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.693463 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 
06:27:32.693505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6sjq\" (UniqueName: \"kubernetes.io/projected/b274ae67-2cd8-40ad-a998-0483646d84a1-kube-api-access-s6sjq\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.746308 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dbc4eca-4725-4ac6-a657-c191f0e9b1f2\") pod \"openstack-cell1-galera-0\" (UID: \"b274ae67-2cd8-40ad-a998-0483646d84a1\") " pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.762307 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.857146 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcdeb8de-47da-4a31-b19f-e178726e3676-config-data\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.857187 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bcdeb8de-47da-4a31-b19f-e178726e3676-kolla-config\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.857228 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcdeb8de-47da-4a31-b19f-e178726e3676-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.857267 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkftt\" (UniqueName: \"kubernetes.io/projected/bcdeb8de-47da-4a31-b19f-e178726e3676-kube-api-access-xkftt\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.857294 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdeb8de-47da-4a31-b19f-e178726e3676-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.958516 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcdeb8de-47da-4a31-b19f-e178726e3676-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.958613 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkftt\" (UniqueName: \"kubernetes.io/projected/bcdeb8de-47da-4a31-b19f-e178726e3676-kube-api-access-xkftt\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.958673 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdeb8de-47da-4a31-b19f-e178726e3676-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.958762 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcdeb8de-47da-4a31-b19f-e178726e3676-config-data\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.958805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bcdeb8de-47da-4a31-b19f-e178726e3676-kolla-config\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.960201 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcdeb8de-47da-4a31-b19f-e178726e3676-config-data\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.960229 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bcdeb8de-47da-4a31-b19f-e178726e3676-kolla-config\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.964566 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcdeb8de-47da-4a31-b19f-e178726e3676-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.964912 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdeb8de-47da-4a31-b19f-e178726e3676-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " 
pod="openstack/memcached-0" Jan 30 06:27:32 crc kubenswrapper[4841]: I0130 06:27:32.977624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkftt\" (UniqueName: \"kubernetes.io/projected/bcdeb8de-47da-4a31-b19f-e178726e3676-kube-api-access-xkftt\") pod \"memcached-0\" (UID: \"bcdeb8de-47da-4a31-b19f-e178726e3676\") " pod="openstack/memcached-0" Jan 30 06:27:33 crc kubenswrapper[4841]: I0130 06:27:33.057913 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 06:27:33 crc kubenswrapper[4841]: I0130 06:27:33.192546 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 06:27:33 crc kubenswrapper[4841]: W0130 06:27:33.200682 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb274ae67_2cd8_40ad_a998_0483646d84a1.slice/crio-39995fa919f47f14c41a8b1f8b10a28976121890e64ad1eb620148a2e3d4bff3 WatchSource:0}: Error finding container 39995fa919f47f14c41a8b1f8b10a28976121890e64ad1eb620148a2e3d4bff3: Status 404 returned error can't find the container with id 39995fa919f47f14c41a8b1f8b10a28976121890e64ad1eb620148a2e3d4bff3 Jan 30 06:27:33 crc kubenswrapper[4841]: I0130 06:27:33.574934 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 06:27:33 crc kubenswrapper[4841]: I0130 06:27:33.632210 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b274ae67-2cd8-40ad-a998-0483646d84a1","Type":"ContainerStarted","Data":"b5eb4174d79b7bdfb2d3a6b6cf3a0262f55150c7d3b5f64547e98418ea383aeb"} Jan 30 06:27:33 crc kubenswrapper[4841]: I0130 06:27:33.632275 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"b274ae67-2cd8-40ad-a998-0483646d84a1","Type":"ContainerStarted","Data":"39995fa919f47f14c41a8b1f8b10a28976121890e64ad1eb620148a2e3d4bff3"} Jan 30 06:27:33 crc kubenswrapper[4841]: I0130 06:27:33.633160 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"bcdeb8de-47da-4a31-b19f-e178726e3676","Type":"ContainerStarted","Data":"fc39849d0779db524aa55e1e4f935191b45931c49231edd8301ca1c56404f319"} Jan 30 06:27:34 crc kubenswrapper[4841]: I0130 06:27:34.646238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"bcdeb8de-47da-4a31-b19f-e178726e3676","Type":"ContainerStarted","Data":"633d9a289e1f63a804e002a9fc482e9c7fe5db5f84d81e5998d542087ad3341e"} Jan 30 06:27:34 crc kubenswrapper[4841]: I0130 06:27:34.675537 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.675513411 podStartE2EDuration="2.675513411s" podCreationTimestamp="2026-01-30 06:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:27:34.674787342 +0000 UTC m=+4791.668260030" watchObservedRunningTime="2026-01-30 06:27:34.675513411 +0000 UTC m=+4791.668986089" Jan 30 06:27:35 crc kubenswrapper[4841]: I0130 06:27:35.658832 4841 generic.go:334] "Generic (PLEG): container finished" podID="f5fdf7cb-89ff-42f2-b405-f0423ab54224" containerID="3f05bfd4a65fca9db45f544c82cbdc48444605053922972fbb5011f41cc7e36d" exitCode=0 Jan 30 06:27:35 crc kubenswrapper[4841]: I0130 06:27:35.658954 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5fdf7cb-89ff-42f2-b405-f0423ab54224","Type":"ContainerDied","Data":"3f05bfd4a65fca9db45f544c82cbdc48444605053922972fbb5011f41cc7e36d"} Jan 30 06:27:35 crc kubenswrapper[4841]: I0130 06:27:35.659639 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/memcached-0" Jan 30 06:27:36 crc kubenswrapper[4841]: I0130 06:27:36.676182 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5fdf7cb-89ff-42f2-b405-f0423ab54224","Type":"ContainerStarted","Data":"96accc0172fd8209842098a44e62f57dfd4a188ab994ca2f82b491a5b8ae48c5"} Jan 30 06:27:36 crc kubenswrapper[4841]: I0130 06:27:36.720573 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.720534799 podStartE2EDuration="7.720534799s" podCreationTimestamp="2026-01-30 06:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:27:36.706319271 +0000 UTC m=+4793.699791989" watchObservedRunningTime="2026-01-30 06:27:36.720534799 +0000 UTC m=+4793.714007477" Jan 30 06:27:37 crc kubenswrapper[4841]: I0130 06:27:37.695907 4841 generic.go:334] "Generic (PLEG): container finished" podID="b274ae67-2cd8-40ad-a998-0483646d84a1" containerID="b5eb4174d79b7bdfb2d3a6b6cf3a0262f55150c7d3b5f64547e98418ea383aeb" exitCode=0 Jan 30 06:27:37 crc kubenswrapper[4841]: I0130 06:27:37.695974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b274ae67-2cd8-40ad-a998-0483646d84a1","Type":"ContainerDied","Data":"b5eb4174d79b7bdfb2d3a6b6cf3a0262f55150c7d3b5f64547e98418ea383aeb"} Jan 30 06:27:38 crc kubenswrapper[4841]: I0130 06:27:38.060065 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 30 06:27:38 crc kubenswrapper[4841]: I0130 06:27:38.702379 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b274ae67-2cd8-40ad-a998-0483646d84a1","Type":"ContainerStarted","Data":"7cd3c972b3006e00a64e4e5a77a4a171296623393d70633dd62a1e2a90548ba2"} Jan 30 06:27:38 crc kubenswrapper[4841]: I0130 06:27:38.724503 
4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.724484863 podStartE2EDuration="7.724484863s" podCreationTimestamp="2026-01-30 06:27:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:27:38.719189553 +0000 UTC m=+4795.712662191" watchObservedRunningTime="2026-01-30 06:27:38.724484863 +0000 UTC m=+4795.717957501" Jan 30 06:27:38 crc kubenswrapper[4841]: I0130 06:27:38.772640 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" Jan 30 06:27:39 crc kubenswrapper[4841]: I0130 06:27:39.038891 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" Jan 30 06:27:39 crc kubenswrapper[4841]: I0130 06:27:39.095917 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865d9b578f-r9qc4"] Jan 30 06:27:39 crc kubenswrapper[4841]: I0130 06:27:39.711434 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" podUID="238ad799-58fb-4f52-8873-055218e25807" containerName="dnsmasq-dns" containerID="cri-o://d789deb7f5417992aa89e3a85761f1b1117a6985ff588d45222bc12c6f050cb8" gracePeriod=10 Jan 30 06:27:40 crc kubenswrapper[4841]: I0130 06:27:40.723636 4841 generic.go:334] "Generic (PLEG): container finished" podID="238ad799-58fb-4f52-8873-055218e25807" containerID="d789deb7f5417992aa89e3a85761f1b1117a6985ff588d45222bc12c6f050cb8" exitCode=0 Jan 30 06:27:40 crc kubenswrapper[4841]: I0130 06:27:40.724083 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" event={"ID":"238ad799-58fb-4f52-8873-055218e25807","Type":"ContainerDied","Data":"d789deb7f5417992aa89e3a85761f1b1117a6985ff588d45222bc12c6f050cb8"} Jan 30 06:27:40 crc kubenswrapper[4841]: 
I0130 06:27:40.862173 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" Jan 30 06:27:40 crc kubenswrapper[4841]: I0130 06:27:40.997658 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-dns-svc\") pod \"238ad799-58fb-4f52-8873-055218e25807\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " Jan 30 06:27:40 crc kubenswrapper[4841]: I0130 06:27:40.997739 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-config\") pod \"238ad799-58fb-4f52-8873-055218e25807\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " Jan 30 06:27:40 crc kubenswrapper[4841]: I0130 06:27:40.997940 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gll22\" (UniqueName: \"kubernetes.io/projected/238ad799-58fb-4f52-8873-055218e25807-kube-api-access-gll22\") pod \"238ad799-58fb-4f52-8873-055218e25807\" (UID: \"238ad799-58fb-4f52-8873-055218e25807\") " Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.007205 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/238ad799-58fb-4f52-8873-055218e25807-kube-api-access-gll22" (OuterVolumeSpecName: "kube-api-access-gll22") pod "238ad799-58fb-4f52-8873-055218e25807" (UID: "238ad799-58fb-4f52-8873-055218e25807"). InnerVolumeSpecName "kube-api-access-gll22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.044107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-config" (OuterVolumeSpecName: "config") pod "238ad799-58fb-4f52-8873-055218e25807" (UID: "238ad799-58fb-4f52-8873-055218e25807"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.044731 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "238ad799-58fb-4f52-8873-055218e25807" (UID: "238ad799-58fb-4f52-8873-055218e25807"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.100253 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gll22\" (UniqueName: \"kubernetes.io/projected/238ad799-58fb-4f52-8873-055218e25807-kube-api-access-gll22\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.100470 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.100579 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238ad799-58fb-4f52-8873-055218e25807-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.191832 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.193569 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.739132 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.739162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865d9b578f-r9qc4" event={"ID":"238ad799-58fb-4f52-8873-055218e25807","Type":"ContainerDied","Data":"63971e926a0650f04298faccb80504a79858aff98018b43f53c0a6bce8298197"} Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.739247 4841 scope.go:117] "RemoveContainer" containerID="d789deb7f5417992aa89e3a85761f1b1117a6985ff588d45222bc12c6f050cb8" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.778321 4841 scope.go:117] "RemoveContainer" containerID="8b31fddee532dd3148666f46a730c8823bd9270b3770fd432bb7b4fcfc146d4e" Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.790615 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865d9b578f-r9qc4"] Jan 30 06:27:41 crc kubenswrapper[4841]: I0130 06:27:41.801945 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865d9b578f-r9qc4"] Jan 30 06:27:42 crc kubenswrapper[4841]: I0130 06:27:42.449371 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="238ad799-58fb-4f52-8873-055218e25807" path="/var/lib/kubelet/pods/238ad799-58fb-4f52-8873-055218e25807/volumes" Jan 30 06:27:42 crc kubenswrapper[4841]: I0130 06:27:42.763338 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:42 crc kubenswrapper[4841]: I0130 06:27:42.763427 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:43 crc kubenswrapper[4841]: I0130 06:27:43.156021 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:43 crc kubenswrapper[4841]: I0130 06:27:43.642329 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Jan 30 06:27:43 crc kubenswrapper[4841]: I0130 06:27:43.752594 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 30 06:27:43 crc kubenswrapper[4841]: I0130 06:27:43.967226 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.878250 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qwbn6"] Jan 30 06:27:49 crc kubenswrapper[4841]: E0130 06:27:49.879576 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238ad799-58fb-4f52-8873-055218e25807" containerName="init" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.879594 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="238ad799-58fb-4f52-8873-055218e25807" containerName="init" Jan 30 06:27:49 crc kubenswrapper[4841]: E0130 06:27:49.879611 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238ad799-58fb-4f52-8873-055218e25807" containerName="dnsmasq-dns" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.879618 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="238ad799-58fb-4f52-8873-055218e25807" containerName="dnsmasq-dns" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.879825 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="238ad799-58fb-4f52-8873-055218e25807" containerName="dnsmasq-dns" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.880425 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.884090 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.895981 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qwbn6"] Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.973078 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be771af4-4ff8-46ac-99ca-0c0453db6203-operator-scripts\") pod \"root-account-create-update-qwbn6\" (UID: \"be771af4-4ff8-46ac-99ca-0c0453db6203\") " pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:49 crc kubenswrapper[4841]: I0130 06:27:49.973165 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lszwg\" (UniqueName: \"kubernetes.io/projected/be771af4-4ff8-46ac-99ca-0c0453db6203-kube-api-access-lszwg\") pod \"root-account-create-update-qwbn6\" (UID: \"be771af4-4ff8-46ac-99ca-0c0453db6203\") " pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:50 crc kubenswrapper[4841]: I0130 06:27:50.075099 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be771af4-4ff8-46ac-99ca-0c0453db6203-operator-scripts\") pod \"root-account-create-update-qwbn6\" (UID: \"be771af4-4ff8-46ac-99ca-0c0453db6203\") " pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:50 crc kubenswrapper[4841]: I0130 06:27:50.075249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lszwg\" (UniqueName: \"kubernetes.io/projected/be771af4-4ff8-46ac-99ca-0c0453db6203-kube-api-access-lszwg\") pod \"root-account-create-update-qwbn6\" (UID: 
\"be771af4-4ff8-46ac-99ca-0c0453db6203\") " pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:50 crc kubenswrapper[4841]: I0130 06:27:50.076169 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be771af4-4ff8-46ac-99ca-0c0453db6203-operator-scripts\") pod \"root-account-create-update-qwbn6\" (UID: \"be771af4-4ff8-46ac-99ca-0c0453db6203\") " pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:50 crc kubenswrapper[4841]: I0130 06:27:50.102654 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lszwg\" (UniqueName: \"kubernetes.io/projected/be771af4-4ff8-46ac-99ca-0c0453db6203-kube-api-access-lszwg\") pod \"root-account-create-update-qwbn6\" (UID: \"be771af4-4ff8-46ac-99ca-0c0453db6203\") " pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:50 crc kubenswrapper[4841]: I0130 06:27:50.202778 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:50 crc kubenswrapper[4841]: I0130 06:27:50.801883 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qwbn6"] Jan 30 06:27:50 crc kubenswrapper[4841]: W0130 06:27:50.804003 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe771af4_4ff8_46ac_99ca_0c0453db6203.slice/crio-7ab5d2d14be18e7af7acf982ae16147d16fc134183112dea78c6bdcf92742d6b WatchSource:0}: Error finding container 7ab5d2d14be18e7af7acf982ae16147d16fc134183112dea78c6bdcf92742d6b: Status 404 returned error can't find the container with id 7ab5d2d14be18e7af7acf982ae16147d16fc134183112dea78c6bdcf92742d6b Jan 30 06:27:50 crc kubenswrapper[4841]: I0130 06:27:50.852008 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qwbn6" event={"ID":"be771af4-4ff8-46ac-99ca-0c0453db6203","Type":"ContainerStarted","Data":"7ab5d2d14be18e7af7acf982ae16147d16fc134183112dea78c6bdcf92742d6b"} Jan 30 06:27:51 crc kubenswrapper[4841]: I0130 06:27:51.864308 4841 generic.go:334] "Generic (PLEG): container finished" podID="be771af4-4ff8-46ac-99ca-0c0453db6203" containerID="b5f822efc82114dae2c8983ded39a30c20fed1924b327059e0a01920c8fe50e4" exitCode=0 Jan 30 06:27:51 crc kubenswrapper[4841]: I0130 06:27:51.864392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qwbn6" event={"ID":"be771af4-4ff8-46ac-99ca-0c0453db6203","Type":"ContainerDied","Data":"b5f822efc82114dae2c8983ded39a30c20fed1924b327059e0a01920c8fe50e4"} Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.324114 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.446238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lszwg\" (UniqueName: \"kubernetes.io/projected/be771af4-4ff8-46ac-99ca-0c0453db6203-kube-api-access-lszwg\") pod \"be771af4-4ff8-46ac-99ca-0c0453db6203\" (UID: \"be771af4-4ff8-46ac-99ca-0c0453db6203\") " Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.446322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be771af4-4ff8-46ac-99ca-0c0453db6203-operator-scripts\") pod \"be771af4-4ff8-46ac-99ca-0c0453db6203\" (UID: \"be771af4-4ff8-46ac-99ca-0c0453db6203\") " Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.447096 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be771af4-4ff8-46ac-99ca-0c0453db6203-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be771af4-4ff8-46ac-99ca-0c0453db6203" (UID: "be771af4-4ff8-46ac-99ca-0c0453db6203"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.456599 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be771af4-4ff8-46ac-99ca-0c0453db6203-kube-api-access-lszwg" (OuterVolumeSpecName: "kube-api-access-lszwg") pod "be771af4-4ff8-46ac-99ca-0c0453db6203" (UID: "be771af4-4ff8-46ac-99ca-0c0453db6203"). InnerVolumeSpecName "kube-api-access-lszwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.548957 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lszwg\" (UniqueName: \"kubernetes.io/projected/be771af4-4ff8-46ac-99ca-0c0453db6203-kube-api-access-lszwg\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.548997 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be771af4-4ff8-46ac-99ca-0c0453db6203-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.907638 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qwbn6" event={"ID":"be771af4-4ff8-46ac-99ca-0c0453db6203","Type":"ContainerDied","Data":"7ab5d2d14be18e7af7acf982ae16147d16fc134183112dea78c6bdcf92742d6b"} Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.907708 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ab5d2d14be18e7af7acf982ae16147d16fc134183112dea78c6bdcf92742d6b" Jan 30 06:27:53 crc kubenswrapper[4841]: I0130 06:27:53.907830 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qwbn6" Jan 30 06:27:56 crc kubenswrapper[4841]: I0130 06:27:56.370247 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qwbn6"] Jan 30 06:27:56 crc kubenswrapper[4841]: I0130 06:27:56.381571 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qwbn6"] Jan 30 06:27:56 crc kubenswrapper[4841]: I0130 06:27:56.448185 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be771af4-4ff8-46ac-99ca-0c0453db6203" path="/var/lib/kubelet/pods/be771af4-4ff8-46ac-99ca-0c0453db6203/volumes" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.374599 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zx4n9"] Jan 30 06:28:01 crc kubenswrapper[4841]: E0130 06:28:01.375315 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be771af4-4ff8-46ac-99ca-0c0453db6203" containerName="mariadb-account-create-update" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.375336 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="be771af4-4ff8-46ac-99ca-0c0453db6203" containerName="mariadb-account-create-update" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.375620 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="be771af4-4ff8-46ac-99ca-0c0453db6203" containerName="mariadb-account-create-update" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.376425 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.379243 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.389980 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zx4n9"] Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.492867 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebbafff3-ee90-444f-94fb-a03bcf07a439-operator-scripts\") pod \"root-account-create-update-zx4n9\" (UID: \"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.492940 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx9kx\" (UniqueName: \"kubernetes.io/projected/ebbafff3-ee90-444f-94fb-a03bcf07a439-kube-api-access-jx9kx\") pod \"root-account-create-update-zx4n9\" (UID: \"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.593959 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebbafff3-ee90-444f-94fb-a03bcf07a439-operator-scripts\") pod \"root-account-create-update-zx4n9\" (UID: \"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.594010 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx9kx\" (UniqueName: \"kubernetes.io/projected/ebbafff3-ee90-444f-94fb-a03bcf07a439-kube-api-access-jx9kx\") pod \"root-account-create-update-zx4n9\" (UID: 
\"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.595253 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebbafff3-ee90-444f-94fb-a03bcf07a439-operator-scripts\") pod \"root-account-create-update-zx4n9\" (UID: \"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.617821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx9kx\" (UniqueName: \"kubernetes.io/projected/ebbafff3-ee90-444f-94fb-a03bcf07a439-kube-api-access-jx9kx\") pod \"root-account-create-update-zx4n9\" (UID: \"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:01 crc kubenswrapper[4841]: I0130 06:28:01.695156 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:02 crc kubenswrapper[4841]: I0130 06:28:02.006861 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zx4n9"] Jan 30 06:28:03 crc kubenswrapper[4841]: I0130 06:28:03.001439 4841 generic.go:334] "Generic (PLEG): container finished" podID="ebbafff3-ee90-444f-94fb-a03bcf07a439" containerID="aa67f1ce5ea8c43fbb1ff000f094fb3e174145b2d53cf88544796a4d2c0691ea" exitCode=0 Jan 30 06:28:03 crc kubenswrapper[4841]: I0130 06:28:03.001514 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zx4n9" event={"ID":"ebbafff3-ee90-444f-94fb-a03bcf07a439","Type":"ContainerDied","Data":"aa67f1ce5ea8c43fbb1ff000f094fb3e174145b2d53cf88544796a4d2c0691ea"} Jan 30 06:28:03 crc kubenswrapper[4841]: I0130 06:28:03.001767 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zx4n9" event={"ID":"ebbafff3-ee90-444f-94fb-a03bcf07a439","Type":"ContainerStarted","Data":"f6855005305b36b88c6910d55d5359a6b23412264971f544c16976208a2c83de"} Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.019858 4841 generic.go:334] "Generic (PLEG): container finished" podID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerID="1c65b7743403721ba54e28250bc6aafe2e1aecebcf4cfc247c63bbd4ad1e756b" exitCode=0 Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.019972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fb4c14d-b45a-48d9-8233-d69c8928f10a","Type":"ContainerDied","Data":"1c65b7743403721ba54e28250bc6aafe2e1aecebcf4cfc247c63bbd4ad1e756b"} Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.462843 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.583285 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx9kx\" (UniqueName: \"kubernetes.io/projected/ebbafff3-ee90-444f-94fb-a03bcf07a439-kube-api-access-jx9kx\") pod \"ebbafff3-ee90-444f-94fb-a03bcf07a439\" (UID: \"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.583454 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebbafff3-ee90-444f-94fb-a03bcf07a439-operator-scripts\") pod \"ebbafff3-ee90-444f-94fb-a03bcf07a439\" (UID: \"ebbafff3-ee90-444f-94fb-a03bcf07a439\") " Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.583782 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebbafff3-ee90-444f-94fb-a03bcf07a439-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebbafff3-ee90-444f-94fb-a03bcf07a439" (UID: "ebbafff3-ee90-444f-94fb-a03bcf07a439"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.584668 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebbafff3-ee90-444f-94fb-a03bcf07a439-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.590189 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbafff3-ee90-444f-94fb-a03bcf07a439-kube-api-access-jx9kx" (OuterVolumeSpecName: "kube-api-access-jx9kx") pod "ebbafff3-ee90-444f-94fb-a03bcf07a439" (UID: "ebbafff3-ee90-444f-94fb-a03bcf07a439"). InnerVolumeSpecName "kube-api-access-jx9kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:04 crc kubenswrapper[4841]: E0130 06:28:04.665865 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb910dc4_3d79_4346_bce7_5ee16ef0576d.slice/crio-conmon-65fa24b5fc8dbd46d053c318fb8572c25ea465821895b136e8708788173b117b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb910dc4_3d79_4346_bce7_5ee16ef0576d.slice/crio-65fa24b5fc8dbd46d053c318fb8572c25ea465821895b136e8708788173b117b.scope\": RecentStats: unable to find data in memory cache]" Jan 30 06:28:04 crc kubenswrapper[4841]: I0130 06:28:04.686049 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx9kx\" (UniqueName: \"kubernetes.io/projected/ebbafff3-ee90-444f-94fb-a03bcf07a439-kube-api-access-jx9kx\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.031296 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zx4n9" event={"ID":"ebbafff3-ee90-444f-94fb-a03bcf07a439","Type":"ContainerDied","Data":"f6855005305b36b88c6910d55d5359a6b23412264971f544c16976208a2c83de"} Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.031341 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6855005305b36b88c6910d55d5359a6b23412264971f544c16976208a2c83de" Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.031809 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zx4n9" Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.033539 4841 generic.go:334] "Generic (PLEG): container finished" podID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerID="65fa24b5fc8dbd46d053c318fb8572c25ea465821895b136e8708788173b117b" exitCode=0 Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.033613 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db910dc4-3d79-4346-bce7-5ee16ef0576d","Type":"ContainerDied","Data":"65fa24b5fc8dbd46d053c318fb8572c25ea465821895b136e8708788173b117b"} Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.039132 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fb4c14d-b45a-48d9-8233-d69c8928f10a","Type":"ContainerStarted","Data":"73550872df766b45850a9da92b066ded5db7e2ee29e90d70829cea000398b60e"} Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.039454 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:05 crc kubenswrapper[4841]: I0130 06:28:05.113688 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.113657862 podStartE2EDuration="37.113657862s" podCreationTimestamp="2026-01-30 06:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:28:05.099850694 +0000 UTC m=+4822.093323372" watchObservedRunningTime="2026-01-30 06:28:05.113657862 +0000 UTC m=+4822.107130540" Jan 30 06:28:06 crc kubenswrapper[4841]: I0130 06:28:06.047154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db910dc4-3d79-4346-bce7-5ee16ef0576d","Type":"ContainerStarted","Data":"5787bc454e10577dc1291261cb2b749b629818e5df6e4e5977b0c283ada08adf"} Jan 30 06:28:06 
crc kubenswrapper[4841]: I0130 06:28:06.048688 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 06:28:06 crc kubenswrapper[4841]: I0130 06:28:06.080511 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.080496218 podStartE2EDuration="38.080496218s" podCreationTimestamp="2026-01-30 06:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:28:06.077286173 +0000 UTC m=+4823.070758811" watchObservedRunningTime="2026-01-30 06:28:06.080496218 +0000 UTC m=+4823.073968856" Jan 30 06:28:19 crc kubenswrapper[4841]: I0130 06:28:19.967698 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:20 crc kubenswrapper[4841]: I0130 06:28:20.217596 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.658824 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699964fbc-nrhwp"] Jan 30 06:28:24 crc kubenswrapper[4841]: E0130 06:28:24.659612 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbafff3-ee90-444f-94fb-a03bcf07a439" containerName="mariadb-account-create-update" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.659627 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbafff3-ee90-444f-94fb-a03bcf07a439" containerName="mariadb-account-create-update" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.659782 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbafff3-ee90-444f-94fb-a03bcf07a439" containerName="mariadb-account-create-update" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.660583 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.669457 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-nrhwp"] Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.714844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-dns-svc\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.714917 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-config\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.714968 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d45l\" (UniqueName: \"kubernetes.io/projected/3e71987d-6237-432a-9a8d-d360b62b497d-kube-api-access-6d45l\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.816630 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d45l\" (UniqueName: \"kubernetes.io/projected/3e71987d-6237-432a-9a8d-d360b62b497d-kube-api-access-6d45l\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.816808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-dns-svc\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.816894 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-config\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.817643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-dns-svc\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.818131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-config\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.841457 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d45l\" (UniqueName: \"kubernetes.io/projected/3e71987d-6237-432a-9a8d-d360b62b497d-kube-api-access-6d45l\") pod \"dnsmasq-dns-699964fbc-nrhwp\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:24 crc kubenswrapper[4841]: I0130 06:28:24.983239 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:25 crc kubenswrapper[4841]: I0130 06:28:25.398776 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:28:25 crc kubenswrapper[4841]: I0130 06:28:25.489128 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-nrhwp"] Jan 30 06:28:26 crc kubenswrapper[4841]: I0130 06:28:26.215830 4841 generic.go:334] "Generic (PLEG): container finished" podID="3e71987d-6237-432a-9a8d-d360b62b497d" containerID="34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce" exitCode=0 Jan 30 06:28:26 crc kubenswrapper[4841]: I0130 06:28:26.216116 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" event={"ID":"3e71987d-6237-432a-9a8d-d360b62b497d","Type":"ContainerDied","Data":"34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce"} Jan 30 06:28:26 crc kubenswrapper[4841]: I0130 06:28:26.216184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" event={"ID":"3e71987d-6237-432a-9a8d-d360b62b497d","Type":"ContainerStarted","Data":"eb48a1f42d0ca3dee9456f06d6e8bc5154ee0e42da70b38d5c892e6a37e51327"} Jan 30 06:28:26 crc kubenswrapper[4841]: I0130 06:28:26.221027 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:28:27 crc kubenswrapper[4841]: I0130 06:28:27.231446 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" event={"ID":"3e71987d-6237-432a-9a8d-d360b62b497d","Type":"ContainerStarted","Data":"602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419"} Jan 30 06:28:27 crc kubenswrapper[4841]: I0130 06:28:27.232897 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:29 crc kubenswrapper[4841]: I0130 06:28:29.738500 4841 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerName="rabbitmq" containerID="cri-o://5787bc454e10577dc1291261cb2b749b629818e5df6e4e5977b0c283ada08adf" gracePeriod=604796 Jan 30 06:28:30 crc kubenswrapper[4841]: I0130 06:28:30.215985 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.250:5671: connect: connection refused" Jan 30 06:28:30 crc kubenswrapper[4841]: I0130 06:28:30.858715 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerName="rabbitmq" containerID="cri-o://73550872df766b45850a9da92b066ded5db7e2ee29e90d70829cea000398b60e" gracePeriod=604796 Jan 30 06:28:34 crc kubenswrapper[4841]: I0130 06:28:34.984663 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.017100 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" podStartSLOduration=11.017070666 podStartE2EDuration="11.017070666s" podCreationTimestamp="2026-01-30 06:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:28:27.267022371 +0000 UTC m=+4844.260495049" watchObservedRunningTime="2026-01-30 06:28:35.017070666 +0000 UTC m=+4852.010543344" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.056808 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-r27xj"] Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.057150 4841 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" podUID="17731592-890a-403a-86f6-61223bcd5320" containerName="dnsmasq-dns" containerID="cri-o://b2d4a08ed12b126208cc5f04e62bd96329a7d5ab50ba0d85f08e1eb21bfe2519" gracePeriod=10 Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.296578 4841 generic.go:334] "Generic (PLEG): container finished" podID="17731592-890a-403a-86f6-61223bcd5320" containerID="b2d4a08ed12b126208cc5f04e62bd96329a7d5ab50ba0d85f08e1eb21bfe2519" exitCode=0 Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.296621 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" event={"ID":"17731592-890a-403a-86f6-61223bcd5320","Type":"ContainerDied","Data":"b2d4a08ed12b126208cc5f04e62bd96329a7d5ab50ba0d85f08e1eb21bfe2519"} Jan 30 06:28:35 crc kubenswrapper[4841]: E0130 06:28:35.323029 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17731592_890a_403a_86f6_61223bcd5320.slice/crio-conmon-b2d4a08ed12b126208cc5f04e62bd96329a7d5ab50ba0d85f08e1eb21bfe2519.scope\": RecentStats: unable to find data in memory cache]" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.500090 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.608743 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24b52\" (UniqueName: \"kubernetes.io/projected/17731592-890a-403a-86f6-61223bcd5320-kube-api-access-24b52\") pod \"17731592-890a-403a-86f6-61223bcd5320\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.608843 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-config\") pod \"17731592-890a-403a-86f6-61223bcd5320\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.608890 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-dns-svc\") pod \"17731592-890a-403a-86f6-61223bcd5320\" (UID: \"17731592-890a-403a-86f6-61223bcd5320\") " Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.617966 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17731592-890a-403a-86f6-61223bcd5320-kube-api-access-24b52" (OuterVolumeSpecName: "kube-api-access-24b52") pod "17731592-890a-403a-86f6-61223bcd5320" (UID: "17731592-890a-403a-86f6-61223bcd5320"). InnerVolumeSpecName "kube-api-access-24b52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.646678 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17731592-890a-403a-86f6-61223bcd5320" (UID: "17731592-890a-403a-86f6-61223bcd5320"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.658128 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-config" (OuterVolumeSpecName: "config") pod "17731592-890a-403a-86f6-61223bcd5320" (UID: "17731592-890a-403a-86f6-61223bcd5320"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.711198 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.711252 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24b52\" (UniqueName: \"kubernetes.io/projected/17731592-890a-403a-86f6-61223bcd5320-kube-api-access-24b52\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:35 crc kubenswrapper[4841]: I0130 06:28:35.711271 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17731592-890a-403a-86f6-61223bcd5320-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.309459 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" event={"ID":"17731592-890a-403a-86f6-61223bcd5320","Type":"ContainerDied","Data":"b50eb118fbf4ffed03434a01e8639c548780a2e050e3faddfcf663b225838241"} Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.309821 4841 scope.go:117] "RemoveContainer" containerID="b2d4a08ed12b126208cc5f04e62bd96329a7d5ab50ba0d85f08e1eb21bfe2519" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.310075 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-r27xj" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.323103 4841 generic.go:334] "Generic (PLEG): container finished" podID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerID="5787bc454e10577dc1291261cb2b749b629818e5df6e4e5977b0c283ada08adf" exitCode=0 Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.323180 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db910dc4-3d79-4346-bce7-5ee16ef0576d","Type":"ContainerDied","Data":"5787bc454e10577dc1291261cb2b749b629818e5df6e4e5977b0c283ada08adf"} Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.341516 4841 scope.go:117] "RemoveContainer" containerID="6f8d8fd789a3a8c5538159b9fa3cbdbb5f76436fde77899c3b06b5e6c78713f0" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.388380 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-r27xj"] Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.398471 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-r27xj"] Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.448584 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17731592-890a-403a-86f6-61223bcd5320" path="/var/lib/kubelet/pods/17731592-890a-403a-86f6-61223bcd5320/volumes" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.816619 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.931898 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-plugins-conf\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932008 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-plugins\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932049 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvk2x\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-kube-api-access-wvk2x\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932136 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-confd\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932188 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db910dc4-3d79-4346-bce7-5ee16ef0576d-erlang-cookie-secret\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-server-conf\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932629 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-config-data\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-erlang-cookie\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932809 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-tls\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.932852 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db910dc4-3d79-4346-bce7-5ee16ef0576d-pod-info\") pod \"db910dc4-3d79-4346-bce7-5ee16ef0576d\" (UID: \"db910dc4-3d79-4346-bce7-5ee16ef0576d\") " Jan 30 06:28:36 
crc kubenswrapper[4841]: I0130 06:28:36.932659 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.933211 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.934005 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.941532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/db910dc4-3d79-4346-bce7-5ee16ef0576d-pod-info" (OuterVolumeSpecName: "pod-info") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.941573 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.941601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db910dc4-3d79-4346-bce7-5ee16ef0576d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.941601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-kube-api-access-wvk2x" (OuterVolumeSpecName: "kube-api-access-wvk2x") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "kube-api-access-wvk2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.969026 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b" (OuterVolumeSpecName: "persistence") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.987136 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-config-data" (OuterVolumeSpecName: "config-data") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:36 crc kubenswrapper[4841]: I0130 06:28:36.989161 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-server-conf" (OuterVolumeSpecName: "server-conf") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.025920 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "db910dc4-3d79-4346-bce7-5ee16ef0576d" (UID: "db910dc4-3d79-4346-bce7-5ee16ef0576d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036149 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036174 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvk2x\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-kube-api-access-wvk2x\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036182 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036190 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db910dc4-3d79-4346-bce7-5ee16ef0576d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036201 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036225 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") on node \"crc\" " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036237 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-config-data\") on node \"crc\" DevicePath \"\"" 
Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036247 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036257 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db910dc4-3d79-4346-bce7-5ee16ef0576d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036265 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db910dc4-3d79-4346-bce7-5ee16ef0576d-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.036272 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db910dc4-3d79-4346-bce7-5ee16ef0576d-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.050608 4841 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.050723 4841 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b") on node "crc" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.137323 4841 reconciler_common.go:293] "Volume detached for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.334420 4841 generic.go:334] "Generic (PLEG): container finished" podID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerID="73550872df766b45850a9da92b066ded5db7e2ee29e90d70829cea000398b60e" exitCode=0 Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.334501 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fb4c14d-b45a-48d9-8233-d69c8928f10a","Type":"ContainerDied","Data":"73550872df766b45850a9da92b066ded5db7e2ee29e90d70829cea000398b60e"} Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.338458 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db910dc4-3d79-4346-bce7-5ee16ef0576d","Type":"ContainerDied","Data":"df9d5ad62a832444f4497f40f83814cbf1ecd3e1add32d084c77a5480c17989a"} Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.338514 4841 scope.go:117] "RemoveContainer" containerID="5787bc454e10577dc1291261cb2b749b629818e5df6e4e5977b0c283ada08adf" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.338526 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.383181 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.398030 4841 scope.go:117] "RemoveContainer" containerID="65fa24b5fc8dbd46d053c318fb8572c25ea465821895b136e8708788173b117b" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.404905 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.424210 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444390 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-server-conf\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444512 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-confd\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444543 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fb4c14d-b45a-48d9-8233-d69c8928f10a-erlang-cookie-secret\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444718 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 
crc kubenswrapper[4841]: I0130 06:28:37.444742 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2w5v\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-kube-api-access-n2w5v\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444769 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-plugins\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444807 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-tls\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444826 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-erlang-cookie\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444858 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-config-data\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444893 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-plugins-conf\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.444962 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fb4c14d-b45a-48d9-8233-d69c8928f10a-pod-info\") pod \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\" (UID: \"8fb4c14d-b45a-48d9-8233-d69c8928f10a\") " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.446888 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.447130 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.447767 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8fb4c14d-b45a-48d9-8233-d69c8928f10a-pod-info" (OuterVolumeSpecName: "pod-info") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.449540 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:28:37 crc kubenswrapper[4841]: E0130 06:28:37.449896 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerName="setup-container" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.449918 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerName="setup-container" Jan 30 06:28:37 crc kubenswrapper[4841]: E0130 06:28:37.449932 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17731592-890a-403a-86f6-61223bcd5320" containerName="dnsmasq-dns" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.449939 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="17731592-890a-403a-86f6-61223bcd5320" containerName="dnsmasq-dns" Jan 30 06:28:37 crc kubenswrapper[4841]: E0130 06:28:37.449950 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerName="rabbitmq" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.449957 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerName="rabbitmq" Jan 30 06:28:37 crc kubenswrapper[4841]: E0130 06:28:37.449969 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerName="setup-container" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.449974 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerName="setup-container" Jan 30 06:28:37 crc kubenswrapper[4841]: E0130 06:28:37.449990 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerName="rabbitmq" Jan 30 06:28:37 crc 
kubenswrapper[4841]: I0130 06:28:37.449996 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerName="rabbitmq" Jan 30 06:28:37 crc kubenswrapper[4841]: E0130 06:28:37.450005 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17731592-890a-403a-86f6-61223bcd5320" containerName="init" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.450010 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="17731592-890a-403a-86f6-61223bcd5320" containerName="init" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.450114 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.450140 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" containerName="rabbitmq" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.450239 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="17731592-890a-403a-86f6-61223bcd5320" containerName="dnsmasq-dns" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.450267 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" containerName="rabbitmq" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.451785 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.452098 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.453458 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.456706 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb4c14d-b45a-48d9-8233-d69c8928f10a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.457070 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.457166 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x22vp" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.457369 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.457541 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.457675 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.458447 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.461679 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-server-conf" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.463283 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-kube-api-access-n2w5v" (OuterVolumeSpecName: "kube-api-access-n2w5v") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "kube-api-access-n2w5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.482863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a" (OuterVolumeSpecName: "persistence") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "pvc-4954808d-f069-4715-a5fb-55a2c904e83a". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.486590 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-config-data" (OuterVolumeSpecName: "config-data") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.522590 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-server-conf" (OuterVolumeSpecName: "server-conf") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.537724 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8fb4c14d-b45a-48d9-8233-d69c8928f10a" (UID: "8fb4c14d-b45a-48d9-8233-d69c8928f10a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-config-data\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546435 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546456 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546515 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a957beed-affa-4b58-9ac2-f3fe95d3a50c-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546534 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546576 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6gv\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-kube-api-access-nq6gv\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546593 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546623 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546641 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546673 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a957beed-affa-4b58-9ac2-f3fe95d3a50c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546720 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546731 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fb4c14d-b45a-48d9-8233-d69c8928f10a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546753 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") on node \"crc\" " Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546763 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2w5v\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-kube-api-access-n2w5v\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 
crc kubenswrapper[4841]: I0130 06:28:37.546772 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546780 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546789 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fb4c14d-b45a-48d9-8233-d69c8928f10a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546797 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546806 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546813 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fb4c14d-b45a-48d9-8233-d69c8928f10a-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.546822 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fb4c14d-b45a-48d9-8233-d69c8928f10a-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.561098 4841 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.561210 4841 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4954808d-f069-4715-a5fb-55a2c904e83a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a") on node "crc" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648431 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a957beed-affa-4b58-9ac2-f3fe95d3a50c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648482 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648519 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6gv\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-kube-api-access-nq6gv\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648553 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648583 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a957beed-affa-4b58-9ac2-f3fe95d3a50c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-config-data\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648661 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.648745 4841 reconciler_common.go:293] "Volume detached for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") on node \"crc\" DevicePath \"\"" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.649332 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.649745 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.650775 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-config-data\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.651167 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a957beed-affa-4b58-9ac2-f3fe95d3a50c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.651248 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.652495 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.652521 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7e36f4f0e8e3b9d51d5bc08cffff477e2bd49540e6100ea70e379b52ab5390c0/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.963428 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.963578 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a957beed-affa-4b58-9ac2-f3fe95d3a50c-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.963735 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.964070 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a957beed-affa-4b58-9ac2-f3fe95d3a50c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:37 crc kubenswrapper[4841]: I0130 06:28:37.964869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6gv\" (UniqueName: \"kubernetes.io/projected/a957beed-affa-4b58-9ac2-f3fe95d3a50c-kube-api-access-nq6gv\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.094386 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38d35e4f-f99c-4a2a-9248-9286a1e6a10b\") pod \"rabbitmq-server-0\" (UID: \"a957beed-affa-4b58-9ac2-f3fe95d3a50c\") " pod="openstack/rabbitmq-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.118528 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.352370 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fb4c14d-b45a-48d9-8233-d69c8928f10a","Type":"ContainerDied","Data":"bfc9c909cf056d8c0027597c3254e35eb7fb2d657113bc9e38c4c961543b5a92"} Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.352695 4841 scope.go:117] "RemoveContainer" containerID="73550872df766b45850a9da92b066ded5db7e2ee29e90d70829cea000398b60e" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.352457 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.373690 4841 scope.go:117] "RemoveContainer" containerID="1c65b7743403721ba54e28250bc6aafe2e1aecebcf4cfc247c63bbd4ad1e756b" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.386024 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.413007 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.427178 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.428716 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.432632 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.432833 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.433219 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.433293 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cwgq7" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.433455 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.433521 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.433589 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.444681 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb4c14d-b45a-48d9-8233-d69c8928f10a" path="/var/lib/kubelet/pods/8fb4c14d-b45a-48d9-8233-d69c8928f10a/volumes" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.445438 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db910dc4-3d79-4346-bce7-5ee16ef0576d" path="/var/lib/kubelet/pods/db910dc4-3d79-4346-bce7-5ee16ef0576d/volumes" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.447971 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 
06:28:38.464681 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.464780 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6b64b15-a098-4533-98d1-c9d8ac355ada-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.464805 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.464849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.464895 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.464918 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6b64b15-a098-4533-98d1-c9d8ac355ada-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.464948 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clbxw\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-kube-api-access-clbxw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.464979 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.465003 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.465021 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.465042 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566241 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566282 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6b64b15-a098-4533-98d1-c9d8ac355ada-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566321 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clbxw\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-kube-api-access-clbxw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566367 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566426 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566488 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566538 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566592 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566670 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6b64b15-a098-4533-98d1-c9d8ac355ada-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.566756 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.567715 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.567745 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.567808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.567827 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.569105 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a6b64b15-a098-4533-98d1-c9d8ac355ada-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.572601 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a6b64b15-a098-4533-98d1-c9d8ac355ada-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.572703 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a6b64b15-a098-4533-98d1-c9d8ac355ada-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.573246 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.573275 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/21cc17baf9198502c98b0abbf0c18d1a023cbe3555ccedbc346e9059d9fdd614/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.578532 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.579502 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.583475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clbxw\" (UniqueName: \"kubernetes.io/projected/a6b64b15-a098-4533-98d1-c9d8ac355ada-kube-api-access-clbxw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.617833 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4954808d-f069-4715-a5fb-55a2c904e83a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4954808d-f069-4715-a5fb-55a2c904e83a\") pod \"rabbitmq-cell1-server-0\" (UID: \"a6b64b15-a098-4533-98d1-c9d8ac355ada\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.643848 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 30 06:28:38 crc kubenswrapper[4841]: I0130 06:28:38.758215 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:28:39 crc kubenswrapper[4841]: I0130 06:28:39.341859 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 06:28:39 crc kubenswrapper[4841]: I0130 06:28:39.373320 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6b64b15-a098-4533-98d1-c9d8ac355ada","Type":"ContainerStarted","Data":"ae47e19a10c1e170253e2d6ccfe4b04470247a93e551db36621c400cdfb9d5c9"}
Jan 30 06:28:39 crc kubenswrapper[4841]: I0130 06:28:39.375245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a957beed-affa-4b58-9ac2-f3fe95d3a50c","Type":"ContainerStarted","Data":"b6509436e8054ba08d143fcb45ee5f5ace46239a3652f2f2a453147d0fd02879"}
Jan 30 06:28:41 crc kubenswrapper[4841]: I0130 06:28:41.394513 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a957beed-affa-4b58-9ac2-f3fe95d3a50c","Type":"ContainerStarted","Data":"555827f8e6b0e973dd176be00cdfc3e2535a30ce19f5b115225edac4ac9736c4"}
Jan 30 06:28:41 crc kubenswrapper[4841]: I0130 06:28:41.398187 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6b64b15-a098-4533-98d1-c9d8ac355ada","Type":"ContainerStarted","Data":"a86067ee113a57d2142d6fb05b325a4186f4be587fec79bda3f5e982f0383720"}
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.388124 4841 generic.go:334] "Generic (PLEG): container finished" podID="a957beed-affa-4b58-9ac2-f3fe95d3a50c" containerID="555827f8e6b0e973dd176be00cdfc3e2535a30ce19f5b115225edac4ac9736c4" exitCode=0
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.388266 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a957beed-affa-4b58-9ac2-f3fe95d3a50c","Type":"ContainerDied","Data":"555827f8e6b0e973dd176be00cdfc3e2535a30ce19f5b115225edac4ac9736c4"}
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.857961 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qwqp9"]
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.861116 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.904282 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qwqp9"]
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.961303 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-utilities\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.961435 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sz8g\" (UniqueName: \"kubernetes.io/projected/7759a329-326b-45a9-b2a3-c6513030d60d-kube-api-access-8sz8g\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:13 crc kubenswrapper[4841]: I0130 06:29:13.961635 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-catalog-content\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.062851 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-utilities\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.062907 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sz8g\" (UniqueName: \"kubernetes.io/projected/7759a329-326b-45a9-b2a3-c6513030d60d-kube-api-access-8sz8g\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.062979 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-catalog-content\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.063385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-utilities\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.063436 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-catalog-content\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.085938 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sz8g\" (UniqueName: \"kubernetes.io/projected/7759a329-326b-45a9-b2a3-c6513030d60d-kube-api-access-8sz8g\") pod \"redhat-operators-qwqp9\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") " pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.190374 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.399869 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a957beed-affa-4b58-9ac2-f3fe95d3a50c","Type":"ContainerStarted","Data":"6e09a2b7f029dc24f886e955ee9731dce89c48ed8ec03a3df8c3126e510a1415"}
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.400478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.433773 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.433751997 podStartE2EDuration="37.433751997s" podCreationTimestamp="2026-01-30 06:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:29:14.426415762 +0000 UTC m=+4891.419888400" watchObservedRunningTime="2026-01-30 06:29:14.433751997 +0000 UTC m=+4891.427224635"
Jan 30 06:29:14 crc kubenswrapper[4841]: I0130 06:29:14.639952 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qwqp9"]
Jan 30 06:29:15 crc kubenswrapper[4841]: I0130 06:29:15.410897 4841 generic.go:334] "Generic (PLEG): container finished" podID="7759a329-326b-45a9-b2a3-c6513030d60d" containerID="1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6" exitCode=0
Jan 30 06:29:15 crc kubenswrapper[4841]: I0130 06:29:15.410979 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwqp9" event={"ID":"7759a329-326b-45a9-b2a3-c6513030d60d","Type":"ContainerDied","Data":"1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6"}
Jan 30 06:29:15 crc kubenswrapper[4841]: I0130 06:29:15.411188 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwqp9" event={"ID":"7759a329-326b-45a9-b2a3-c6513030d60d","Type":"ContainerStarted","Data":"a9962b35d449b26eb0c109023a288f4e352565f627d015ddd0f4cba15d082065"}
Jan 30 06:29:15 crc kubenswrapper[4841]: I0130 06:29:15.416790 4841 generic.go:334] "Generic (PLEG): container finished" podID="a6b64b15-a098-4533-98d1-c9d8ac355ada" containerID="a86067ee113a57d2142d6fb05b325a4186f4be587fec79bda3f5e982f0383720" exitCode=0
Jan 30 06:29:15 crc kubenswrapper[4841]: I0130 06:29:15.416948 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6b64b15-a098-4533-98d1-c9d8ac355ada","Type":"ContainerDied","Data":"a86067ee113a57d2142d6fb05b325a4186f4be587fec79bda3f5e982f0383720"}
Jan 30 06:29:16 crc kubenswrapper[4841]: I0130 06:29:16.424776 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwqp9" event={"ID":"7759a329-326b-45a9-b2a3-c6513030d60d","Type":"ContainerStarted","Data":"3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3"}
Jan 30 06:29:16 crc kubenswrapper[4841]: I0130 06:29:16.427343 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a6b64b15-a098-4533-98d1-c9d8ac355ada","Type":"ContainerStarted","Data":"7f8b8f9da49f54b1ac55f988ca6ec2a5308e03d1832f5730af745355d9b009ee"}
Jan 30 06:29:16 crc kubenswrapper[4841]: I0130 06:29:16.427860 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:29:16 crc kubenswrapper[4841]: I0130 06:29:16.472838 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.472812787 podStartE2EDuration="38.472812787s" podCreationTimestamp="2026-01-30 06:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:29:16.469301043 +0000 UTC m=+4893.462773741" watchObservedRunningTime="2026-01-30 06:29:16.472812787 +0000 UTC m=+4893.466285465"
Jan 30 06:29:17 crc kubenswrapper[4841]: I0130 06:29:17.437422 4841 generic.go:334] "Generic (PLEG): container finished" podID="7759a329-326b-45a9-b2a3-c6513030d60d" containerID="3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3" exitCode=0
Jan 30 06:29:17 crc kubenswrapper[4841]: I0130 06:29:17.437528 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwqp9" event={"ID":"7759a329-326b-45a9-b2a3-c6513030d60d","Type":"ContainerDied","Data":"3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3"}
Jan 30 06:29:18 crc kubenswrapper[4841]: I0130 06:29:18.450476 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwqp9" event={"ID":"7759a329-326b-45a9-b2a3-c6513030d60d","Type":"ContainerStarted","Data":"43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a"}
Jan 30 06:29:18 crc kubenswrapper[4841]: I0130 06:29:18.489723 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qwqp9" podStartSLOduration=3.071627857 podStartE2EDuration="5.489696165s" podCreationTimestamp="2026-01-30 06:29:13 +0000 UTC" firstStartedPulling="2026-01-30 06:29:15.412930093 +0000 UTC m=+4892.406402731" lastFinishedPulling="2026-01-30 06:29:17.830998401 +0000 UTC m=+4894.824471039" observedRunningTime="2026-01-30 06:29:18.483279664 +0000 UTC m=+4895.476752312" watchObservedRunningTime="2026-01-30 06:29:18.489696165 +0000 UTC m=+4895.483168833"
Jan 30 06:29:24 crc kubenswrapper[4841]: I0130 06:29:24.190768 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:24 crc kubenswrapper[4841]: I0130 06:29:24.191475 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:25 crc kubenswrapper[4841]: I0130 06:29:25.267860 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qwqp9" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="registry-server" probeResult="failure" output=<
Jan 30 06:29:25 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 06:29:25 crc kubenswrapper[4841]: >
Jan 30 06:29:28 crc kubenswrapper[4841]: I0130 06:29:28.123767 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 30 06:29:28 crc kubenswrapper[4841]: I0130 06:29:28.763696 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 06:29:32 crc kubenswrapper[4841]: I0130 06:29:32.694463 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:29:32 crc kubenswrapper[4841]: I0130 06:29:32.696540 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:29:32 crc kubenswrapper[4841]: I0130 06:29:32.698350 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b4dbj"
Jan 30 06:29:32 crc kubenswrapper[4841]: I0130 06:29:32.702073 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:29:32 crc kubenswrapper[4841]: I0130 06:29:32.799362 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkqj6\" (UniqueName: \"kubernetes.io/projected/4f0e0612-e1b5-49c9-8bfe-8db12be3954e-kube-api-access-qkqj6\") pod \"mariadb-client\" (UID: \"4f0e0612-e1b5-49c9-8bfe-8db12be3954e\") " pod="openstack/mariadb-client"
Jan 30 06:29:32 crc kubenswrapper[4841]: I0130 06:29:32.901162 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkqj6\" (UniqueName: \"kubernetes.io/projected/4f0e0612-e1b5-49c9-8bfe-8db12be3954e-kube-api-access-qkqj6\") pod \"mariadb-client\" (UID: \"4f0e0612-e1b5-49c9-8bfe-8db12be3954e\") " pod="openstack/mariadb-client"
Jan 30 06:29:33 crc kubenswrapper[4841]: I0130 06:29:33.543451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkqj6\" (UniqueName: \"kubernetes.io/projected/4f0e0612-e1b5-49c9-8bfe-8db12be3954e-kube-api-access-qkqj6\") pod \"mariadb-client\" (UID: \"4f0e0612-e1b5-49c9-8bfe-8db12be3954e\") " pod="openstack/mariadb-client"
Jan 30 06:29:33 crc kubenswrapper[4841]: I0130 06:29:33.828079 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:29:34 crc kubenswrapper[4841]: I0130 06:29:34.257791 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:34 crc kubenswrapper[4841]: I0130 06:29:34.333934 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:34 crc kubenswrapper[4841]: I0130 06:29:34.442889 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:29:34 crc kubenswrapper[4841]: W0130 06:29:34.446601 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f0e0612_e1b5_49c9_8bfe_8db12be3954e.slice/crio-59fba5c5a40af488c292a91402354f61e0de82f6ba6dcddb2e510cde7ad0a436 WatchSource:0}: Error finding container 59fba5c5a40af488c292a91402354f61e0de82f6ba6dcddb2e510cde7ad0a436: Status 404 returned error can't find the container with id 59fba5c5a40af488c292a91402354f61e0de82f6ba6dcddb2e510cde7ad0a436
Jan 30 06:29:34 crc kubenswrapper[4841]: I0130 06:29:34.500801 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qwqp9"]
Jan 30 06:29:34 crc kubenswrapper[4841]: I0130 06:29:34.606509 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4f0e0612-e1b5-49c9-8bfe-8db12be3954e","Type":"ContainerStarted","Data":"59fba5c5a40af488c292a91402354f61e0de82f6ba6dcddb2e510cde7ad0a436"}
Jan 30 06:29:35 crc kubenswrapper[4841]: I0130 06:29:35.616870 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4f0e0612-e1b5-49c9-8bfe-8db12be3954e","Type":"ContainerStarted","Data":"8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a"}
Jan 30 06:29:35 crc kubenswrapper[4841]: I0130 06:29:35.617579 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qwqp9" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="registry-server" containerID="cri-o://43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a" gracePeriod=2
Jan 30 06:29:35 crc kubenswrapper[4841]: I0130 06:29:35.639125 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=3.17751881 podStartE2EDuration="3.639106768s" podCreationTimestamp="2026-01-30 06:29:32 +0000 UTC" firstStartedPulling="2026-01-30 06:29:34.448659049 +0000 UTC m=+4911.442131687" lastFinishedPulling="2026-01-30 06:29:34.910246967 +0000 UTC m=+4911.903719645" observedRunningTime="2026-01-30 06:29:35.634128786 +0000 UTC m=+4912.627601464" watchObservedRunningTime="2026-01-30 06:29:35.639106768 +0000 UTC m=+4912.632579406"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.121601 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.256099 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sz8g\" (UniqueName: \"kubernetes.io/projected/7759a329-326b-45a9-b2a3-c6513030d60d-kube-api-access-8sz8g\") pod \"7759a329-326b-45a9-b2a3-c6513030d60d\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") "
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.256548 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-catalog-content\") pod \"7759a329-326b-45a9-b2a3-c6513030d60d\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") "
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.256720 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-utilities\") pod \"7759a329-326b-45a9-b2a3-c6513030d60d\" (UID: \"7759a329-326b-45a9-b2a3-c6513030d60d\") "
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.257926 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-utilities" (OuterVolumeSpecName: "utilities") pod "7759a329-326b-45a9-b2a3-c6513030d60d" (UID: "7759a329-326b-45a9-b2a3-c6513030d60d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.267756 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7759a329-326b-45a9-b2a3-c6513030d60d-kube-api-access-8sz8g" (OuterVolumeSpecName: "kube-api-access-8sz8g") pod "7759a329-326b-45a9-b2a3-c6513030d60d" (UID: "7759a329-326b-45a9-b2a3-c6513030d60d"). InnerVolumeSpecName "kube-api-access-8sz8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.358960 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sz8g\" (UniqueName: \"kubernetes.io/projected/7759a329-326b-45a9-b2a3-c6513030d60d-kube-api-access-8sz8g\") on node \"crc\" DevicePath \"\""
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.359014 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.401766 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7759a329-326b-45a9-b2a3-c6513030d60d" (UID: "7759a329-326b-45a9-b2a3-c6513030d60d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.460210 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7759a329-326b-45a9-b2a3-c6513030d60d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.626199 4841 generic.go:334] "Generic (PLEG): container finished" podID="7759a329-326b-45a9-b2a3-c6513030d60d" containerID="43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a" exitCode=0
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.626294 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwqp9" event={"ID":"7759a329-326b-45a9-b2a3-c6513030d60d","Type":"ContainerDied","Data":"43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a"}
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.627468 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwqp9" event={"ID":"7759a329-326b-45a9-b2a3-c6513030d60d","Type":"ContainerDied","Data":"a9962b35d449b26eb0c109023a288f4e352565f627d015ddd0f4cba15d082065"}
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.626302 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qwqp9"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.627512 4841 scope.go:117] "RemoveContainer" containerID="43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.655498 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qwqp9"]
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.655776 4841 scope.go:117] "RemoveContainer" containerID="3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.662897 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qwqp9"]
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.679232 4841 scope.go:117] "RemoveContainer" containerID="1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.710042 4841 scope.go:117] "RemoveContainer" containerID="43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a"
Jan 30 06:29:36 crc kubenswrapper[4841]: E0130 06:29:36.710862 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a\": container with ID starting with 43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a not found: ID does not exist" containerID="43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.710914 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a"} err="failed to get container status \"43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a\": rpc error: code = NotFound desc = could not find container \"43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a\": container with ID starting with 43bf9e3133c05d2598f6defe1184e7230b760ef109432f8c331836f5048c718a not found: ID does not exist"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.710939 4841 scope.go:117] "RemoveContainer" containerID="3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3"
Jan 30 06:29:36 crc kubenswrapper[4841]: E0130 06:29:36.711498 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3\": container with ID starting with 3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3 not found: ID does not exist" containerID="3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.711563 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3"} err="failed to get container status \"3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3\": rpc error: code = NotFound desc = could not find container \"3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3\": container with ID starting with 3d900dd59373a27f9455c5fc99a2dd709f32de02c3a51d3dfc34af75157b6ae3 not found: ID does not exist"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.711606 4841 scope.go:117] "RemoveContainer" containerID="1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6"
Jan 30 06:29:36 crc kubenswrapper[4841]: E0130 06:29:36.712008 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6\": container with ID starting with 1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6 not found: ID does not exist" containerID="1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6"
Jan 30 06:29:36 crc kubenswrapper[4841]: I0130 06:29:36.712044 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6"} err="failed to get container status \"1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6\": rpc error: code = NotFound desc = could not find container \"1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6\": container with ID starting with 1dcfd509f122cfd68b339dffdc4e998002cf347de70de315a71c2ed23c774ef6 not found: ID does not exist"
Jan 30 06:29:38 crc kubenswrapper[4841]: I0130 06:29:38.448787 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" path="/var/lib/kubelet/pods/7759a329-326b-45a9-b2a3-c6513030d60d/volumes"
Jan 30 06:29:40 crc kubenswrapper[4841]: I0130 06:29:40.463895 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:29:40 crc kubenswrapper[4841]: I0130 06:29:40.464246 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:29:48 crc kubenswrapper[4841]: I0130 06:29:48.630444 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:29:48 crc kubenswrapper[4841]: I0130 06:29:48.631197 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="4f0e0612-e1b5-49c9-8bfe-8db12be3954e" containerName="mariadb-client" containerID="cri-o://8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a" gracePeriod=30
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.641629 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.732521 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkqj6\" (UniqueName: \"kubernetes.io/projected/4f0e0612-e1b5-49c9-8bfe-8db12be3954e-kube-api-access-qkqj6\") pod \"4f0e0612-e1b5-49c9-8bfe-8db12be3954e\" (UID: \"4f0e0612-e1b5-49c9-8bfe-8db12be3954e\") "
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.741992 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0e0612-e1b5-49c9-8bfe-8db12be3954e-kube-api-access-qkqj6" (OuterVolumeSpecName: "kube-api-access-qkqj6") pod "4f0e0612-e1b5-49c9-8bfe-8db12be3954e" (UID: "4f0e0612-e1b5-49c9-8bfe-8db12be3954e"). InnerVolumeSpecName "kube-api-access-qkqj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.751888 4841 generic.go:334] "Generic (PLEG): container finished" podID="4f0e0612-e1b5-49c9-8bfe-8db12be3954e" containerID="8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a" exitCode=143
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.751955 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4f0e0612-e1b5-49c9-8bfe-8db12be3954e","Type":"ContainerDied","Data":"8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a"}
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.751994 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4f0e0612-e1b5-49c9-8bfe-8db12be3954e","Type":"ContainerDied","Data":"59fba5c5a40af488c292a91402354f61e0de82f6ba6dcddb2e510cde7ad0a436"}
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.752022 4841 scope.go:117] "RemoveContainer" containerID="8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a"
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.752080 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.801630 4841 scope.go:117] "RemoveContainer" containerID="8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a"
Jan 30 06:29:49 crc kubenswrapper[4841]: E0130 06:29:49.802226 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a\": container with ID starting with 8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a not found: ID does not exist" containerID="8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a"
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.803925 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a"} err="failed to get container status \"8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a\": rpc error: code = NotFound desc = could not find container \"8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a\": container with ID starting with 8f203a67d8a0aea9f604f67b72ab64b23773a7312fa4899b455bcbd724d4cc1a not found: ID does not exist"
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.808373 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.816749 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Jan 30 06:29:49 crc kubenswrapper[4841]: I0130 06:29:49.834147 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkqj6\" (UniqueName: \"kubernetes.io/projected/4f0e0612-e1b5-49c9-8bfe-8db12be3954e-kube-api-access-qkqj6\") on node \"crc\" DevicePath \"\""
Jan 30 06:29:50 crc kubenswrapper[4841]: I0130 06:29:50.449725 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f0e0612-e1b5-49c9-8bfe-8db12be3954e" path="/var/lib/kubelet/pods/4f0e0612-e1b5-49c9-8bfe-8db12be3954e/volumes"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.172285 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7"]
Jan 30 06:30:00 crc kubenswrapper[4841]: E0130 06:30:00.173535 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="extract-content"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.173563 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="extract-content"
Jan 30 06:30:00 crc kubenswrapper[4841]: E0130 06:30:00.173618 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="extract-utilities"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.173636 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="extract-utilities"
Jan 30 06:30:00 crc kubenswrapper[4841]: E0130 06:30:00.173663 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="registry-server"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.173675 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="registry-server"
Jan 30 06:30:00 crc kubenswrapper[4841]: E0130 06:30:00.173701 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0e0612-e1b5-49c9-8bfe-8db12be3954e" containerName="mariadb-client"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.173713 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0e0612-e1b5-49c9-8bfe-8db12be3954e" containerName="mariadb-client"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.173956 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7759a329-326b-45a9-b2a3-c6513030d60d" containerName="registry-server"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.173983 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f0e0612-e1b5-49c9-8bfe-8db12be3954e" containerName="mariadb-client"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.174823 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.178811 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.178972 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.194476 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7"]
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.304318 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-secret-volume\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.304370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-config-volume\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7"
Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.304550 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxzh\" (UniqueName: \"kubernetes.io/projected/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-kube-api-access-dgxzh\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.405915 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-secret-volume\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.406021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-config-volume\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.406247 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxzh\" (UniqueName: \"kubernetes.io/projected/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-kube-api-access-dgxzh\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.407223 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-config-volume\") pod 
\"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.464091 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-secret-volume\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.464375 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxzh\" (UniqueName: \"kubernetes.io/projected/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-kube-api-access-dgxzh\") pod \"collect-profiles-29495910-664t7\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.506327 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:00 crc kubenswrapper[4841]: I0130 06:30:00.854708 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7"] Jan 30 06:30:01 crc kubenswrapper[4841]: I0130 06:30:01.854220 4841 generic.go:334] "Generic (PLEG): container finished" podID="8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b" containerID="69bdfd8c97f13c1bedb72d6c61327afd7e2b82f25e9a0b2e96788d5b0c1a23fa" exitCode=0 Jan 30 06:30:01 crc kubenswrapper[4841]: I0130 06:30:01.854300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" event={"ID":"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b","Type":"ContainerDied","Data":"69bdfd8c97f13c1bedb72d6c61327afd7e2b82f25e9a0b2e96788d5b0c1a23fa"} Jan 30 06:30:01 crc kubenswrapper[4841]: I0130 06:30:01.854637 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" event={"ID":"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b","Type":"ContainerStarted","Data":"4113ad993479c2684e9e080a74de0b7be90eee74900a100c7ec76f5e657e5844"} Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.221341 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.356630 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-secret-volume\") pod \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.356778 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-config-volume\") pod \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.357023 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxzh\" (UniqueName: \"kubernetes.io/projected/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-kube-api-access-dgxzh\") pod \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\" (UID: \"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b\") " Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.357669 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-config-volume" (OuterVolumeSpecName: "config-volume") pod "8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b" (UID: "8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.362230 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-kube-api-access-dgxzh" (OuterVolumeSpecName: "kube-api-access-dgxzh") pod "8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b" (UID: "8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b"). 
InnerVolumeSpecName "kube-api-access-dgxzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.367644 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b" (UID: "8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.462145 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgxzh\" (UniqueName: \"kubernetes.io/projected/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-kube-api-access-dgxzh\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.462195 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.462248 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.874269 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" event={"ID":"8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b","Type":"ContainerDied","Data":"4113ad993479c2684e9e080a74de0b7be90eee74900a100c7ec76f5e657e5844"} Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.874924 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4113ad993479c2684e9e080a74de0b7be90eee74900a100c7ec76f5e657e5844" Jan 30 06:30:03 crc kubenswrapper[4841]: I0130 06:30:03.874377 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495910-664t7" Jan 30 06:30:04 crc kubenswrapper[4841]: I0130 06:30:04.330133 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"] Jan 30 06:30:04 crc kubenswrapper[4841]: I0130 06:30:04.340537 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495865-f8fjk"] Jan 30 06:30:04 crc kubenswrapper[4841]: I0130 06:30:04.447138 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eedeb5d2-eeb0-44f1-919d-23fe3b5c0316" path="/var/lib/kubelet/pods/eedeb5d2-eeb0-44f1-919d-23fe3b5c0316/volumes" Jan 30 06:30:10 crc kubenswrapper[4841]: I0130 06:30:10.464332 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:30:10 crc kubenswrapper[4841]: I0130 06:30:10.464957 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.398907 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jw7x7"] Jan 30 06:30:20 crc kubenswrapper[4841]: E0130 06:30:20.400049 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b" containerName="collect-profiles" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.400071 4841 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b" containerName="collect-profiles" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.400438 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e03b3d0-b593-4ee3-b5ad-4a0fe81eaf8b" containerName="collect-profiles" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.402464 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.414334 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jw7x7"] Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.602796 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-utilities\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.602858 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prx62\" (UniqueName: \"kubernetes.io/projected/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-kube-api-access-prx62\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.602886 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-catalog-content\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.704651 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prx62\" (UniqueName: \"kubernetes.io/projected/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-kube-api-access-prx62\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.704766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-catalog-content\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.704910 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-utilities\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.705356 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-catalog-content\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.705571 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-utilities\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:20 crc kubenswrapper[4841]: I0130 06:30:20.729783 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-prx62\" (UniqueName: \"kubernetes.io/projected/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-kube-api-access-prx62\") pod \"community-operators-jw7x7\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:21 crc kubenswrapper[4841]: I0130 06:30:21.030140 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:21 crc kubenswrapper[4841]: I0130 06:30:21.340490 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jw7x7"] Jan 30 06:30:22 crc kubenswrapper[4841]: I0130 06:30:22.097704 4841 generic.go:334] "Generic (PLEG): container finished" podID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerID="5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b" exitCode=0 Jan 30 06:30:22 crc kubenswrapper[4841]: I0130 06:30:22.097811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jw7x7" event={"ID":"ebdfadcd-66c4-432d-b2f1-d6bb281c168b","Type":"ContainerDied","Data":"5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b"} Jan 30 06:30:22 crc kubenswrapper[4841]: I0130 06:30:22.098376 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jw7x7" event={"ID":"ebdfadcd-66c4-432d-b2f1-d6bb281c168b","Type":"ContainerStarted","Data":"564435bc8ebfd62b8c7e653b62afcf087eb2c43ad200bb5cb2415a9020d80382"} Jan 30 06:30:22 crc kubenswrapper[4841]: I0130 06:30:22.100532 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:30:23 crc kubenswrapper[4841]: I0130 06:30:23.111785 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jw7x7" 
event={"ID":"ebdfadcd-66c4-432d-b2f1-d6bb281c168b","Type":"ContainerStarted","Data":"2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0"} Jan 30 06:30:24 crc kubenswrapper[4841]: I0130 06:30:24.126962 4841 generic.go:334] "Generic (PLEG): container finished" podID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerID="2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0" exitCode=0 Jan 30 06:30:24 crc kubenswrapper[4841]: I0130 06:30:24.127024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jw7x7" event={"ID":"ebdfadcd-66c4-432d-b2f1-d6bb281c168b","Type":"ContainerDied","Data":"2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0"} Jan 30 06:30:25 crc kubenswrapper[4841]: I0130 06:30:25.163380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jw7x7" event={"ID":"ebdfadcd-66c4-432d-b2f1-d6bb281c168b","Type":"ContainerStarted","Data":"7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5"} Jan 30 06:30:25 crc kubenswrapper[4841]: I0130 06:30:25.195396 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jw7x7" podStartSLOduration=2.5791331250000002 podStartE2EDuration="5.195365329s" podCreationTimestamp="2026-01-30 06:30:20 +0000 UTC" firstStartedPulling="2026-01-30 06:30:22.100204066 +0000 UTC m=+4959.093676704" lastFinishedPulling="2026-01-30 06:30:24.7164362 +0000 UTC m=+4961.709908908" observedRunningTime="2026-01-30 06:30:25.186537584 +0000 UTC m=+4962.180010282" watchObservedRunningTime="2026-01-30 06:30:25.195365329 +0000 UTC m=+4962.188838007" Jan 30 06:30:31 crc kubenswrapper[4841]: I0130 06:30:31.032534 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:31 crc kubenswrapper[4841]: I0130 06:30:31.033238 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:31 crc kubenswrapper[4841]: I0130 06:30:31.114068 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:31 crc kubenswrapper[4841]: I0130 06:30:31.356917 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:31 crc kubenswrapper[4841]: I0130 06:30:31.406219 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jw7x7"] Jan 30 06:30:33 crc kubenswrapper[4841]: I0130 06:30:33.262771 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jw7x7" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="registry-server" containerID="cri-o://7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5" gracePeriod=2 Jan 30 06:30:33 crc kubenswrapper[4841]: I0130 06:30:33.766590 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:33 crc kubenswrapper[4841]: I0130 06:30:33.943680 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-utilities\") pod \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " Jan 30 06:30:33 crc kubenswrapper[4841]: I0130 06:30:33.943732 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prx62\" (UniqueName: \"kubernetes.io/projected/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-kube-api-access-prx62\") pod \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " Jan 30 06:30:33 crc kubenswrapper[4841]: I0130 06:30:33.943768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-catalog-content\") pod \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\" (UID: \"ebdfadcd-66c4-432d-b2f1-d6bb281c168b\") " Jan 30 06:30:33 crc kubenswrapper[4841]: I0130 06:30:33.944606 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-utilities" (OuterVolumeSpecName: "utilities") pod "ebdfadcd-66c4-432d-b2f1-d6bb281c168b" (UID: "ebdfadcd-66c4-432d-b2f1-d6bb281c168b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:30:33 crc kubenswrapper[4841]: I0130 06:30:33.954386 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-kube-api-access-prx62" (OuterVolumeSpecName: "kube-api-access-prx62") pod "ebdfadcd-66c4-432d-b2f1-d6bb281c168b" (UID: "ebdfadcd-66c4-432d-b2f1-d6bb281c168b"). InnerVolumeSpecName "kube-api-access-prx62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.024596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebdfadcd-66c4-432d-b2f1-d6bb281c168b" (UID: "ebdfadcd-66c4-432d-b2f1-d6bb281c168b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.045753 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.045792 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prx62\" (UniqueName: \"kubernetes.io/projected/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-kube-api-access-prx62\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.045806 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdfadcd-66c4-432d-b2f1-d6bb281c168b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.281291 4841 generic.go:334] "Generic (PLEG): container finished" podID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerID="7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5" exitCode=0 Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.281372 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jw7x7" event={"ID":"ebdfadcd-66c4-432d-b2f1-d6bb281c168b","Type":"ContainerDied","Data":"7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5"} Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.281438 4841 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-jw7x7" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.281507 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jw7x7" event={"ID":"ebdfadcd-66c4-432d-b2f1-d6bb281c168b","Type":"ContainerDied","Data":"564435bc8ebfd62b8c7e653b62afcf087eb2c43ad200bb5cb2415a9020d80382"} Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.281550 4841 scope.go:117] "RemoveContainer" containerID="7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.317613 4841 scope.go:117] "RemoveContainer" containerID="2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.342782 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jw7x7"] Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.358976 4841 scope.go:117] "RemoveContainer" containerID="5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.358998 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jw7x7"] Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.398821 4841 scope.go:117] "RemoveContainer" containerID="7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5" Jan 30 06:30:34 crc kubenswrapper[4841]: E0130 06:30:34.399250 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5\": container with ID starting with 7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5 not found: ID does not exist" containerID="7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.399299 
4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5"} err="failed to get container status \"7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5\": rpc error: code = NotFound desc = could not find container \"7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5\": container with ID starting with 7d7c622c6bb3df430f56b060a25a23219b774bee4af8be3361945d280ce66da5 not found: ID does not exist" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.399331 4841 scope.go:117] "RemoveContainer" containerID="2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0" Jan 30 06:30:34 crc kubenswrapper[4841]: E0130 06:30:34.399843 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0\": container with ID starting with 2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0 not found: ID does not exist" containerID="2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.399889 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0"} err="failed to get container status \"2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0\": rpc error: code = NotFound desc = could not find container \"2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0\": container with ID starting with 2c38c681d3808f30fbc8e3933ff50b66f28d23f521fd31799da7aed97b56d2f0 not found: ID does not exist" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.399921 4841 scope.go:117] "RemoveContainer" containerID="5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b" Jan 30 06:30:34 crc kubenswrapper[4841]: E0130 
06:30:34.400235 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b\": container with ID starting with 5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b not found: ID does not exist" containerID="5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.400270 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b"} err="failed to get container status \"5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b\": rpc error: code = NotFound desc = could not find container \"5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b\": container with ID starting with 5e89f5b4ba93b4b70cef42ab74b434603dd5611937ef57aa8f815c0adb994a4b not found: ID does not exist" Jan 30 06:30:34 crc kubenswrapper[4841]: I0130 06:30:34.441757 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" path="/var/lib/kubelet/pods/ebdfadcd-66c4-432d-b2f1-d6bb281c168b/volumes" Jan 30 06:30:40 crc kubenswrapper[4841]: I0130 06:30:40.463121 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:30:40 crc kubenswrapper[4841]: I0130 06:30:40.463718 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 30 06:30:40 crc kubenswrapper[4841]: I0130 06:30:40.468358 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:30:40 crc kubenswrapper[4841]: I0130 06:30:40.469131 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:30:40 crc kubenswrapper[4841]: I0130 06:30:40.469311 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" gracePeriod=600 Jan 30 06:30:40 crc kubenswrapper[4841]: E0130 06:30:40.602055 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:30:41 crc kubenswrapper[4841]: I0130 06:30:41.410080 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" exitCode=0 Jan 30 06:30:41 crc kubenswrapper[4841]: I0130 06:30:41.410128 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" 
event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176"} Jan 30 06:30:41 crc kubenswrapper[4841]: I0130 06:30:41.410163 4841 scope.go:117] "RemoveContainer" containerID="420c2d89248701f2cd0f1e18fb3f4abc5f28d723a2143270388bf82acff15563" Jan 30 06:30:41 crc kubenswrapper[4841]: I0130 06:30:41.410777 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:30:41 crc kubenswrapper[4841]: E0130 06:30:41.411030 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:30:51 crc kubenswrapper[4841]: I0130 06:30:51.067962 4841 scope.go:117] "RemoveContainer" containerID="be274802c25223a4118606561eb92b9f1ec833d4a6088d395d3838762b889ac4" Jan 30 06:30:54 crc kubenswrapper[4841]: I0130 06:30:54.439941 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:30:54 crc kubenswrapper[4841]: E0130 06:30:54.440685 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:31:08 crc kubenswrapper[4841]: I0130 06:31:08.432024 4841 scope.go:117] "RemoveContainer" 
containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:31:08 crc kubenswrapper[4841]: E0130 06:31:08.432977 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:31:20 crc kubenswrapper[4841]: I0130 06:31:20.431905 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:31:20 crc kubenswrapper[4841]: E0130 06:31:20.432900 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:31:35 crc kubenswrapper[4841]: I0130 06:31:35.432320 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:31:35 crc kubenswrapper[4841]: E0130 06:31:35.433390 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:31:46 crc kubenswrapper[4841]: I0130 06:31:46.432315 4841 scope.go:117] 
"RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:31:46 crc kubenswrapper[4841]: E0130 06:31:46.433483 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:31:51 crc kubenswrapper[4841]: I0130 06:31:51.155893 4841 scope.go:117] "RemoveContainer" containerID="932f1a5c163df2bb79d66b8e0b351d3190a27f815cc4a1feb3d923ced9a08f84" Jan 30 06:31:57 crc kubenswrapper[4841]: I0130 06:31:57.432493 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:31:57 crc kubenswrapper[4841]: E0130 06:31:57.433363 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:32:12 crc kubenswrapper[4841]: I0130 06:32:12.432827 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:32:12 crc kubenswrapper[4841]: E0130 06:32:12.433761 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:32:27 crc kubenswrapper[4841]: I0130 06:32:27.432553 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:32:27 crc kubenswrapper[4841]: E0130 06:32:27.433123 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:32:39 crc kubenswrapper[4841]: I0130 06:32:39.432777 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:32:39 crc kubenswrapper[4841]: E0130 06:32:39.433957 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:32:50 crc kubenswrapper[4841]: I0130 06:32:50.432763 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:32:50 crc kubenswrapper[4841]: E0130 06:32:50.434048 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:33:03 crc kubenswrapper[4841]: I0130 06:33:03.431885 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:33:03 crc kubenswrapper[4841]: E0130 06:33:03.433022 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:33:14 crc kubenswrapper[4841]: I0130 06:33:14.442363 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:33:14 crc kubenswrapper[4841]: E0130 06:33:14.444794 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:33:15 crc kubenswrapper[4841]: I0130 06:33:15.975996 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jwqvx"] Jan 30 06:33:15 crc kubenswrapper[4841]: E0130 06:33:15.976594 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="extract-content" Jan 30 06:33:15 crc kubenswrapper[4841]: I0130 
06:33:15.976608 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="extract-content" Jan 30 06:33:15 crc kubenswrapper[4841]: E0130 06:33:15.976626 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="extract-utilities" Jan 30 06:33:15 crc kubenswrapper[4841]: I0130 06:33:15.976634 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="extract-utilities" Jan 30 06:33:15 crc kubenswrapper[4841]: E0130 06:33:15.976656 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="registry-server" Jan 30 06:33:15 crc kubenswrapper[4841]: I0130 06:33:15.976665 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="registry-server" Jan 30 06:33:15 crc kubenswrapper[4841]: I0130 06:33:15.976879 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdfadcd-66c4-432d-b2f1-d6bb281c168b" containerName="registry-server" Jan 30 06:33:15 crc kubenswrapper[4841]: I0130 06:33:15.978154 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.006109 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwqvx"] Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.045751 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nklzt\" (UniqueName: \"kubernetes.io/projected/c0130e26-f962-4825-bccc-b751cdd31798-kube-api-access-nklzt\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.045849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-utilities\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.045999 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-catalog-content\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.147354 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nklzt\" (UniqueName: \"kubernetes.io/projected/c0130e26-f962-4825-bccc-b751cdd31798-kube-api-access-nklzt\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.147561 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-utilities\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.148459 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-utilities\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.149001 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-catalog-content\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.149061 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-catalog-content\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.363761 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nklzt\" (UniqueName: \"kubernetes.io/projected/c0130e26-f962-4825-bccc-b751cdd31798-kube-api-access-nklzt\") pod \"redhat-marketplace-jwqvx\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:16 crc kubenswrapper[4841]: I0130 06:33:16.613587 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:17 crc kubenswrapper[4841]: I0130 06:33:17.059307 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwqvx"] Jan 30 06:33:17 crc kubenswrapper[4841]: I0130 06:33:17.996974 4841 generic.go:334] "Generic (PLEG): container finished" podID="c0130e26-f962-4825-bccc-b751cdd31798" containerID="eb8edd8d1c819f5c4fb790c24037b3725f24f0128e7eedc2a88331c118b9e9b4" exitCode=0 Jan 30 06:33:17 crc kubenswrapper[4841]: I0130 06:33:17.997081 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwqvx" event={"ID":"c0130e26-f962-4825-bccc-b751cdd31798","Type":"ContainerDied","Data":"eb8edd8d1c819f5c4fb790c24037b3725f24f0128e7eedc2a88331c118b9e9b4"} Jan 30 06:33:17 crc kubenswrapper[4841]: I0130 06:33:17.997287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwqvx" event={"ID":"c0130e26-f962-4825-bccc-b751cdd31798","Type":"ContainerStarted","Data":"2b4a3d7413a200ca44222cac09a0c926ec40bb42a203a5d5bab72ac15d600e4a"} Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.179125 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f95h4"] Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.181467 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.188242 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f95h4"] Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.307602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-utilities\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.307658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-catalog-content\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.307764 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkz94\" (UniqueName: \"kubernetes.io/projected/fbe64ae7-bfae-4574-836a-936ccb3b3bde-kube-api-access-vkz94\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.410375 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkz94\" (UniqueName: \"kubernetes.io/projected/fbe64ae7-bfae-4574-836a-936ccb3b3bde-kube-api-access-vkz94\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.410516 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-utilities\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.410548 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-catalog-content\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.411083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-catalog-content\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.411316 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-utilities\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.451891 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkz94\" (UniqueName: \"kubernetes.io/projected/fbe64ae7-bfae-4574-836a-936ccb3b3bde-kube-api-access-vkz94\") pod \"certified-operators-f95h4\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.530180 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:19 crc kubenswrapper[4841]: I0130 06:33:19.818889 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f95h4"] Jan 30 06:33:20 crc kubenswrapper[4841]: I0130 06:33:20.012478 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f95h4" event={"ID":"fbe64ae7-bfae-4574-836a-936ccb3b3bde","Type":"ContainerStarted","Data":"2a05da998c84b169f4c7ecc014a51362549a9ae208baad1811c9d020f5410633"} Jan 30 06:33:20 crc kubenswrapper[4841]: I0130 06:33:20.014917 4841 generic.go:334] "Generic (PLEG): container finished" podID="c0130e26-f962-4825-bccc-b751cdd31798" containerID="1c365b326d5878b558cde90de1b4a8d412b10ee5d4cca27c63ea8c08c9e10cac" exitCode=0 Jan 30 06:33:20 crc kubenswrapper[4841]: I0130 06:33:20.014959 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwqvx" event={"ID":"c0130e26-f962-4825-bccc-b751cdd31798","Type":"ContainerDied","Data":"1c365b326d5878b558cde90de1b4a8d412b10ee5d4cca27c63ea8c08c9e10cac"} Jan 30 06:33:21 crc kubenswrapper[4841]: I0130 06:33:21.028000 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwqvx" event={"ID":"c0130e26-f962-4825-bccc-b751cdd31798","Type":"ContainerStarted","Data":"1be915db4fc8e4bf4db4850f3edba34dfefc6c4a5958b54272d4a75d5a1dddc5"} Jan 30 06:33:21 crc kubenswrapper[4841]: I0130 06:33:21.030975 4841 generic.go:334] "Generic (PLEG): container finished" podID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerID="b20eb61579efd0dee610be777fd63a010960c2dc1dc64a754940825539c07273" exitCode=0 Jan 30 06:33:21 crc kubenswrapper[4841]: I0130 06:33:21.031034 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f95h4" 
event={"ID":"fbe64ae7-bfae-4574-836a-936ccb3b3bde","Type":"ContainerDied","Data":"b20eb61579efd0dee610be777fd63a010960c2dc1dc64a754940825539c07273"} Jan 30 06:33:21 crc kubenswrapper[4841]: I0130 06:33:21.067851 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jwqvx" podStartSLOduration=3.531865222 podStartE2EDuration="6.067818879s" podCreationTimestamp="2026-01-30 06:33:15 +0000 UTC" firstStartedPulling="2026-01-30 06:33:17.999888331 +0000 UTC m=+5134.993360999" lastFinishedPulling="2026-01-30 06:33:20.535842018 +0000 UTC m=+5137.529314656" observedRunningTime="2026-01-30 06:33:21.053219311 +0000 UTC m=+5138.046691949" watchObservedRunningTime="2026-01-30 06:33:21.067818879 +0000 UTC m=+5138.061291557" Jan 30 06:33:23 crc kubenswrapper[4841]: I0130 06:33:23.053267 4841 generic.go:334] "Generic (PLEG): container finished" podID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerID="484d3097d2fcc85031dcfc6f91d3b65f7373bf3a65aad476c899fd054740ac6b" exitCode=0 Jan 30 06:33:23 crc kubenswrapper[4841]: I0130 06:33:23.053367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f95h4" event={"ID":"fbe64ae7-bfae-4574-836a-936ccb3b3bde","Type":"ContainerDied","Data":"484d3097d2fcc85031dcfc6f91d3b65f7373bf3a65aad476c899fd054740ac6b"} Jan 30 06:33:24 crc kubenswrapper[4841]: I0130 06:33:24.065299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f95h4" event={"ID":"fbe64ae7-bfae-4574-836a-936ccb3b3bde","Type":"ContainerStarted","Data":"79c64b5ee04cdc571bd0fa38c650ee6dde1b9d9f0241d56a2dd9f5a992f5aebf"} Jan 30 06:33:24 crc kubenswrapper[4841]: I0130 06:33:24.097473 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f95h4" podStartSLOduration=2.6092890029999998 podStartE2EDuration="5.097452987s" podCreationTimestamp="2026-01-30 06:33:19 +0000 
UTC" firstStartedPulling="2026-01-30 06:33:21.033139277 +0000 UTC m=+5138.026611955" lastFinishedPulling="2026-01-30 06:33:23.521303291 +0000 UTC m=+5140.514775939" observedRunningTime="2026-01-30 06:33:24.089955958 +0000 UTC m=+5141.083428596" watchObservedRunningTime="2026-01-30 06:33:24.097452987 +0000 UTC m=+5141.090925635" Jan 30 06:33:26 crc kubenswrapper[4841]: I0130 06:33:26.614431 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:26 crc kubenswrapper[4841]: I0130 06:33:26.614665 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:26 crc kubenswrapper[4841]: I0130 06:33:26.653146 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:27 crc kubenswrapper[4841]: I0130 06:33:27.174687 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:29 crc kubenswrapper[4841]: I0130 06:33:29.432100 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:33:29 crc kubenswrapper[4841]: E0130 06:33:29.432528 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:33:29 crc kubenswrapper[4841]: I0130 06:33:29.531266 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:29 crc kubenswrapper[4841]: I0130 
06:33:29.531365 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:29 crc kubenswrapper[4841]: I0130 06:33:29.576947 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:30 crc kubenswrapper[4841]: I0130 06:33:30.303314 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:30 crc kubenswrapper[4841]: I0130 06:33:30.768044 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f95h4"] Jan 30 06:33:31 crc kubenswrapper[4841]: I0130 06:33:31.759674 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwqvx"] Jan 30 06:33:31 crc kubenswrapper[4841]: I0130 06:33:31.760297 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jwqvx" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="registry-server" containerID="cri-o://1be915db4fc8e4bf4db4850f3edba34dfefc6c4a5958b54272d4a75d5a1dddc5" gracePeriod=2 Jan 30 06:33:31 crc kubenswrapper[4841]: E0130 06:33:31.947814 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0130e26_f962_4825_bccc_b751cdd31798.slice/crio-1be915db4fc8e4bf4db4850f3edba34dfefc6c4a5958b54272d4a75d5a1dddc5.scope\": RecentStats: unable to find data in memory cache]" Jan 30 06:33:32 crc kubenswrapper[4841]: I0130 06:33:32.140729 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f95h4" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="registry-server" 
containerID="cri-o://79c64b5ee04cdc571bd0fa38c650ee6dde1b9d9f0241d56a2dd9f5a992f5aebf" gracePeriod=2 Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.153124 4841 generic.go:334] "Generic (PLEG): container finished" podID="c0130e26-f962-4825-bccc-b751cdd31798" containerID="1be915db4fc8e4bf4db4850f3edba34dfefc6c4a5958b54272d4a75d5a1dddc5" exitCode=0 Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.153203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwqvx" event={"ID":"c0130e26-f962-4825-bccc-b751cdd31798","Type":"ContainerDied","Data":"1be915db4fc8e4bf4db4850f3edba34dfefc6c4a5958b54272d4a75d5a1dddc5"} Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.156579 4841 generic.go:334] "Generic (PLEG): container finished" podID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerID="79c64b5ee04cdc571bd0fa38c650ee6dde1b9d9f0241d56a2dd9f5a992f5aebf" exitCode=0 Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.156641 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f95h4" event={"ID":"fbe64ae7-bfae-4574-836a-936ccb3b3bde","Type":"ContainerDied","Data":"79c64b5ee04cdc571bd0fa38c650ee6dde1b9d9f0241d56a2dd9f5a992f5aebf"} Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.423254 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.553013 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-catalog-content\") pod \"c0130e26-f962-4825-bccc-b751cdd31798\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.553543 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nklzt\" (UniqueName: \"kubernetes.io/projected/c0130e26-f962-4825-bccc-b751cdd31798-kube-api-access-nklzt\") pod \"c0130e26-f962-4825-bccc-b751cdd31798\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.553723 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-utilities\") pod \"c0130e26-f962-4825-bccc-b751cdd31798\" (UID: \"c0130e26-f962-4825-bccc-b751cdd31798\") " Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.554511 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-utilities" (OuterVolumeSpecName: "utilities") pod "c0130e26-f962-4825-bccc-b751cdd31798" (UID: "c0130e26-f962-4825-bccc-b751cdd31798"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.555332 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.561173 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0130e26-f962-4825-bccc-b751cdd31798-kube-api-access-nklzt" (OuterVolumeSpecName: "kube-api-access-nklzt") pod "c0130e26-f962-4825-bccc-b751cdd31798" (UID: "c0130e26-f962-4825-bccc-b751cdd31798"). InnerVolumeSpecName "kube-api-access-nklzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.592376 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0130e26-f962-4825-bccc-b751cdd31798" (UID: "c0130e26-f962-4825-bccc-b751cdd31798"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.608491 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.657896 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0130e26-f962-4825-bccc-b751cdd31798-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.657940 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nklzt\" (UniqueName: \"kubernetes.io/projected/c0130e26-f962-4825-bccc-b751cdd31798-kube-api-access-nklzt\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.759583 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-catalog-content\") pod \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.759903 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkz94\" (UniqueName: \"kubernetes.io/projected/fbe64ae7-bfae-4574-836a-936ccb3b3bde-kube-api-access-vkz94\") pod \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.760086 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-utilities\") pod \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\" (UID: \"fbe64ae7-bfae-4574-836a-936ccb3b3bde\") " Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.762379 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-utilities" (OuterVolumeSpecName: "utilities") pod "fbe64ae7-bfae-4574-836a-936ccb3b3bde" (UID: 
"fbe64ae7-bfae-4574-836a-936ccb3b3bde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.764867 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe64ae7-bfae-4574-836a-936ccb3b3bde-kube-api-access-vkz94" (OuterVolumeSpecName: "kube-api-access-vkz94") pod "fbe64ae7-bfae-4574-836a-936ccb3b3bde" (UID: "fbe64ae7-bfae-4574-836a-936ccb3b3bde"). InnerVolumeSpecName "kube-api-access-vkz94". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.861947 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkz94\" (UniqueName: \"kubernetes.io/projected/fbe64ae7-bfae-4574-836a-936ccb3b3bde-kube-api-access-vkz94\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:33 crc kubenswrapper[4841]: I0130 06:33:33.862218 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.178726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f95h4" event={"ID":"fbe64ae7-bfae-4574-836a-936ccb3b3bde","Type":"ContainerDied","Data":"2a05da998c84b169f4c7ecc014a51362549a9ae208baad1811c9d020f5410633"} Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.178816 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f95h4" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.178884 4841 scope.go:117] "RemoveContainer" containerID="79c64b5ee04cdc571bd0fa38c650ee6dde1b9d9f0241d56a2dd9f5a992f5aebf" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.182557 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwqvx" event={"ID":"c0130e26-f962-4825-bccc-b751cdd31798","Type":"ContainerDied","Data":"2b4a3d7413a200ca44222cac09a0c926ec40bb42a203a5d5bab72ac15d600e4a"} Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.182921 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwqvx" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.221569 4841 scope.go:117] "RemoveContainer" containerID="484d3097d2fcc85031dcfc6f91d3b65f7373bf3a65aad476c899fd054740ac6b" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.240870 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwqvx"] Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.249660 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwqvx"] Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.258534 4841 scope.go:117] "RemoveContainer" containerID="b20eb61579efd0dee610be777fd63a010960c2dc1dc64a754940825539c07273" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.289856 4841 scope.go:117] "RemoveContainer" containerID="1be915db4fc8e4bf4db4850f3edba34dfefc6c4a5958b54272d4a75d5a1dddc5" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.316380 4841 scope.go:117] "RemoveContainer" containerID="1c365b326d5878b558cde90de1b4a8d412b10ee5d4cca27c63ea8c08c9e10cac" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.340828 4841 scope.go:117] "RemoveContainer" 
containerID="eb8edd8d1c819f5c4fb790c24037b3725f24f0128e7eedc2a88331c118b9e9b4" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.434045 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbe64ae7-bfae-4574-836a-936ccb3b3bde" (UID: "fbe64ae7-bfae-4574-836a-936ccb3b3bde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.448644 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0130e26-f962-4825-bccc-b751cdd31798" path="/var/lib/kubelet/pods/c0130e26-f962-4825-bccc-b751cdd31798/volumes" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.474731 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe64ae7-bfae-4574-836a-936ccb3b3bde-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.506280 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f95h4"] Jan 30 06:33:34 crc kubenswrapper[4841]: I0130 06:33:34.515913 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f95h4"] Jan 30 06:33:36 crc kubenswrapper[4841]: I0130 06:33:36.449774 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" path="/var/lib/kubelet/pods/fbe64ae7-bfae-4574-836a-936ccb3b3bde/volumes" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.894660 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Jan 30 06:33:38 crc kubenswrapper[4841]: E0130 06:33:38.896550 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="registry-server" Jan 
30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.896667 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="registry-server" Jan 30 06:33:38 crc kubenswrapper[4841]: E0130 06:33:38.896762 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="extract-content" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.896840 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="extract-content" Jan 30 06:33:38 crc kubenswrapper[4841]: E0130 06:33:38.896932 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="extract-content" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.897007 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="extract-content" Jan 30 06:33:38 crc kubenswrapper[4841]: E0130 06:33:38.897116 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="registry-server" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.897217 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="registry-server" Jan 30 06:33:38 crc kubenswrapper[4841]: E0130 06:33:38.897340 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="extract-utilities" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.897542 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="extract-utilities" Jan 30 06:33:38 crc kubenswrapper[4841]: E0130 06:33:38.897672 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="extract-utilities" Jan 30 
06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.897779 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="extract-utilities" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.898140 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe64ae7-bfae-4574-836a-936ccb3b3bde" containerName="registry-server" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.898276 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0130e26-f962-4825-bccc-b751cdd31798" containerName="registry-server" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.899629 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.903504 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b4dbj" Jan 30 06:33:38 crc kubenswrapper[4841]: I0130 06:33:38.905029 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.051662 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\") pod \"mariadb-copy-data\" (UID: \"e8971d3d-723a-46c8-b729-707fbe7b953b\") " pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.051921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zp6t\" (UniqueName: \"kubernetes.io/projected/e8971d3d-723a-46c8-b729-707fbe7b953b-kube-api-access-7zp6t\") pod \"mariadb-copy-data\" (UID: \"e8971d3d-723a-46c8-b729-707fbe7b953b\") " pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.153355 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\") pod \"mariadb-copy-data\" (UID: \"e8971d3d-723a-46c8-b729-707fbe7b953b\") " pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.153435 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zp6t\" (UniqueName: \"kubernetes.io/projected/e8971d3d-723a-46c8-b729-707fbe7b953b-kube-api-access-7zp6t\") pod \"mariadb-copy-data\" (UID: \"e8971d3d-723a-46c8-b729-707fbe7b953b\") " pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.157908 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.157988 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\") pod \"mariadb-copy-data\" (UID: \"e8971d3d-723a-46c8-b729-707fbe7b953b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7fdb4bfbfb7b8b2d44fc4573553de8c5d22ce36810e4fe8232a4adb15986375b/globalmount\"" pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.172270 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zp6t\" (UniqueName: \"kubernetes.io/projected/e8971d3d-723a-46c8-b729-707fbe7b953b-kube-api-access-7zp6t\") pod \"mariadb-copy-data\" (UID: \"e8971d3d-723a-46c8-b729-707fbe7b953b\") " pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.185059 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bf78c2d-aa19-4405-b08e-b86ac7a70891\") pod \"mariadb-copy-data\" (UID: \"e8971d3d-723a-46c8-b729-707fbe7b953b\") " pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.227989 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Jan 30 06:33:39 crc kubenswrapper[4841]: I0130 06:33:39.548746 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Jan 30 06:33:40 crc kubenswrapper[4841]: I0130 06:33:40.236372 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e8971d3d-723a-46c8-b729-707fbe7b953b","Type":"ContainerStarted","Data":"4c451c07b80fc86ad015e16adc43a91c17ec3ea49894643185347f775cca6267"} Jan 30 06:33:40 crc kubenswrapper[4841]: I0130 06:33:40.236768 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e8971d3d-723a-46c8-b729-707fbe7b953b","Type":"ContainerStarted","Data":"0786847e157d0be4b2e109e2106db308e12230216d0a0d49baf9e6a0b4f7e02a"} Jan 30 06:33:40 crc kubenswrapper[4841]: I0130 06:33:40.270146 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.270115336 podStartE2EDuration="3.270115336s" podCreationTimestamp="2026-01-30 06:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:33:40.258701252 +0000 UTC m=+5157.252173930" watchObservedRunningTime="2026-01-30 06:33:40.270115336 +0000 UTC m=+5157.263588014" Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.431752 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:33:43 crc kubenswrapper[4841]: E0130 
06:33:43.432800 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.497789 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.498872 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.510062 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.626492 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sjg\" (UniqueName: \"kubernetes.io/projected/5235b5f7-8a5a-48b5-b574-04ee849f5203-kube-api-access-47sjg\") pod \"mariadb-client\" (UID: \"5235b5f7-8a5a-48b5-b574-04ee849f5203\") " pod="openstack/mariadb-client" Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.728903 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47sjg\" (UniqueName: \"kubernetes.io/projected/5235b5f7-8a5a-48b5-b574-04ee849f5203-kube-api-access-47sjg\") pod \"mariadb-client\" (UID: \"5235b5f7-8a5a-48b5-b574-04ee849f5203\") " pod="openstack/mariadb-client" Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.763154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sjg\" (UniqueName: \"kubernetes.io/projected/5235b5f7-8a5a-48b5-b574-04ee849f5203-kube-api-access-47sjg\") pod \"mariadb-client\" (UID: 
\"5235b5f7-8a5a-48b5-b574-04ee849f5203\") " pod="openstack/mariadb-client" Jan 30 06:33:43 crc kubenswrapper[4841]: I0130 06:33:43.959616 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:44 crc kubenswrapper[4841]: I0130 06:33:44.263913 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:44 crc kubenswrapper[4841]: W0130 06:33:44.267804 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5235b5f7_8a5a_48b5_b574_04ee849f5203.slice/crio-a7ab489a7d115a3e2c6006357aaa4e7971251595548c9d028ed51e04f82a37ce WatchSource:0}: Error finding container a7ab489a7d115a3e2c6006357aaa4e7971251595548c9d028ed51e04f82a37ce: Status 404 returned error can't find the container with id a7ab489a7d115a3e2c6006357aaa4e7971251595548c9d028ed51e04f82a37ce Jan 30 06:33:45 crc kubenswrapper[4841]: I0130 06:33:45.282287 4841 generic.go:334] "Generic (PLEG): container finished" podID="5235b5f7-8a5a-48b5-b574-04ee849f5203" containerID="e3239e27b50f1e3e211cedf510aff342256d92f96bac7311f773731e6e953b68" exitCode=0 Jan 30 06:33:45 crc kubenswrapper[4841]: I0130 06:33:45.282469 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5235b5f7-8a5a-48b5-b574-04ee849f5203","Type":"ContainerDied","Data":"e3239e27b50f1e3e211cedf510aff342256d92f96bac7311f773731e6e953b68"} Jan 30 06:33:45 crc kubenswrapper[4841]: I0130 06:33:45.282650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5235b5f7-8a5a-48b5-b574-04ee849f5203","Type":"ContainerStarted","Data":"a7ab489a7d115a3e2c6006357aaa4e7971251595548c9d028ed51e04f82a37ce"} Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.684019 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.719540 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_5235b5f7-8a5a-48b5-b574-04ee849f5203/mariadb-client/0.log" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.750611 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.765541 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.779525 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47sjg\" (UniqueName: \"kubernetes.io/projected/5235b5f7-8a5a-48b5-b574-04ee849f5203-kube-api-access-47sjg\") pod \"5235b5f7-8a5a-48b5-b574-04ee849f5203\" (UID: \"5235b5f7-8a5a-48b5-b574-04ee849f5203\") " Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.784893 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5235b5f7-8a5a-48b5-b574-04ee849f5203-kube-api-access-47sjg" (OuterVolumeSpecName: "kube-api-access-47sjg") pod "5235b5f7-8a5a-48b5-b574-04ee849f5203" (UID: "5235b5f7-8a5a-48b5-b574-04ee849f5203"). InnerVolumeSpecName "kube-api-access-47sjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.882241 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47sjg\" (UniqueName: \"kubernetes.io/projected/5235b5f7-8a5a-48b5-b574-04ee849f5203-kube-api-access-47sjg\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.925956 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:46 crc kubenswrapper[4841]: E0130 06:33:46.926685 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5235b5f7-8a5a-48b5-b574-04ee849f5203" containerName="mariadb-client" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.926719 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5235b5f7-8a5a-48b5-b574-04ee849f5203" containerName="mariadb-client" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.927010 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5235b5f7-8a5a-48b5-b574-04ee849f5203" containerName="mariadb-client" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.927772 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:46 crc kubenswrapper[4841]: I0130 06:33:46.939601 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.085174 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbgc\" (UniqueName: \"kubernetes.io/projected/ce0155cc-fca5-4e85-b1ca-fd88ec51732b-kube-api-access-fzbgc\") pod \"mariadb-client\" (UID: \"ce0155cc-fca5-4e85-b1ca-fd88ec51732b\") " pod="openstack/mariadb-client" Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.186908 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbgc\" (UniqueName: \"kubernetes.io/projected/ce0155cc-fca5-4e85-b1ca-fd88ec51732b-kube-api-access-fzbgc\") pod \"mariadb-client\" (UID: \"ce0155cc-fca5-4e85-b1ca-fd88ec51732b\") " pod="openstack/mariadb-client" Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.215312 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbgc\" (UniqueName: \"kubernetes.io/projected/ce0155cc-fca5-4e85-b1ca-fd88ec51732b-kube-api-access-fzbgc\") pod \"mariadb-client\" (UID: \"ce0155cc-fca5-4e85-b1ca-fd88ec51732b\") " pod="openstack/mariadb-client" Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.263652 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.298973 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7ab489a7d115a3e2c6006357aaa4e7971251595548c9d028ed51e04f82a37ce" Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.299163 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.334658 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="5235b5f7-8a5a-48b5-b574-04ee849f5203" podUID="ce0155cc-fca5-4e85-b1ca-fd88ec51732b" Jan 30 06:33:47 crc kubenswrapper[4841]: I0130 06:33:47.718987 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:48 crc kubenswrapper[4841]: I0130 06:33:48.308867 4841 generic.go:334] "Generic (PLEG): container finished" podID="ce0155cc-fca5-4e85-b1ca-fd88ec51732b" containerID="26bcd2dd5ac79725d9029453700cba134772cef5b02f992e3fd40717864117fd" exitCode=0 Jan 30 06:33:48 crc kubenswrapper[4841]: I0130 06:33:48.308989 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ce0155cc-fca5-4e85-b1ca-fd88ec51732b","Type":"ContainerDied","Data":"26bcd2dd5ac79725d9029453700cba134772cef5b02f992e3fd40717864117fd"} Jan 30 06:33:48 crc kubenswrapper[4841]: I0130 06:33:48.309181 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ce0155cc-fca5-4e85-b1ca-fd88ec51732b","Type":"ContainerStarted","Data":"d3cbf19f178a244129bd0e7d4851cfee1e8e85a5d270eb7f9900d6a887986b9d"} Jan 30 06:33:48 crc kubenswrapper[4841]: I0130 06:33:48.446152 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5235b5f7-8a5a-48b5-b574-04ee849f5203" path="/var/lib/kubelet/pods/5235b5f7-8a5a-48b5-b574-04ee849f5203/volumes" Jan 30 06:33:49 crc kubenswrapper[4841]: I0130 06:33:49.764160 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:49 crc kubenswrapper[4841]: I0130 06:33:49.796371 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ce0155cc-fca5-4e85-b1ca-fd88ec51732b/mariadb-client/0.log" Jan 30 06:33:49 crc kubenswrapper[4841]: I0130 06:33:49.845111 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:49 crc kubenswrapper[4841]: I0130 06:33:49.859457 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 30 06:33:49 crc kubenswrapper[4841]: I0130 06:33:49.930027 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbgc\" (UniqueName: \"kubernetes.io/projected/ce0155cc-fca5-4e85-b1ca-fd88ec51732b-kube-api-access-fzbgc\") pod \"ce0155cc-fca5-4e85-b1ca-fd88ec51732b\" (UID: \"ce0155cc-fca5-4e85-b1ca-fd88ec51732b\") " Jan 30 06:33:49 crc kubenswrapper[4841]: I0130 06:33:49.936552 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0155cc-fca5-4e85-b1ca-fd88ec51732b-kube-api-access-fzbgc" (OuterVolumeSpecName: "kube-api-access-fzbgc") pod "ce0155cc-fca5-4e85-b1ca-fd88ec51732b" (UID: "ce0155cc-fca5-4e85-b1ca-fd88ec51732b"). InnerVolumeSpecName "kube-api-access-fzbgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:33:50 crc kubenswrapper[4841]: I0130 06:33:50.031483 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbgc\" (UniqueName: \"kubernetes.io/projected/ce0155cc-fca5-4e85-b1ca-fd88ec51732b-kube-api-access-fzbgc\") on node \"crc\" DevicePath \"\"" Jan 30 06:33:50 crc kubenswrapper[4841]: I0130 06:33:50.329361 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3cbf19f178a244129bd0e7d4851cfee1e8e85a5d270eb7f9900d6a887986b9d" Jan 30 06:33:50 crc kubenswrapper[4841]: I0130 06:33:50.329494 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 30 06:33:50 crc kubenswrapper[4841]: I0130 06:33:50.459692 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0155cc-fca5-4e85-b1ca-fd88ec51732b" path="/var/lib/kubelet/pods/ce0155cc-fca5-4e85-b1ca-fd88ec51732b/volumes" Jan 30 06:33:51 crc kubenswrapper[4841]: I0130 06:33:51.243115 4841 scope.go:117] "RemoveContainer" containerID="b5f822efc82114dae2c8983ded39a30c20fed1924b327059e0a01920c8fe50e4" Jan 30 06:33:57 crc kubenswrapper[4841]: I0130 06:33:57.432820 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:33:57 crc kubenswrapper[4841]: E0130 06:33:57.433716 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:34:12 crc kubenswrapper[4841]: I0130 06:34:12.433137 4841 scope.go:117] "RemoveContainer" 
containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:34:12 crc kubenswrapper[4841]: E0130 06:34:12.434291 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:34:23 crc kubenswrapper[4841]: I0130 06:34:23.432318 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:34:23 crc kubenswrapper[4841]: E0130 06:34:23.435152 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:34:38 crc kubenswrapper[4841]: I0130 06:34:38.432884 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:34:38 crc kubenswrapper[4841]: E0130 06:34:38.434026 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:34:51 crc kubenswrapper[4841]: I0130 06:34:51.431966 4841 scope.go:117] 
"RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:34:51 crc kubenswrapper[4841]: E0130 06:34:51.432952 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:35:02 crc kubenswrapper[4841]: I0130 06:35:02.431851 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:35:02 crc kubenswrapper[4841]: E0130 06:35:02.432394 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:35:16 crc kubenswrapper[4841]: I0130 06:35:16.432002 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:35:16 crc kubenswrapper[4841]: E0130 06:35:16.433229 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:35:27 crc kubenswrapper[4841]: I0130 06:35:27.432612 
4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:35:27 crc kubenswrapper[4841]: E0130 06:35:27.434653 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:35:38 crc kubenswrapper[4841]: I0130 06:35:38.431683 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:35:38 crc kubenswrapper[4841]: E0130 06:35:38.432820 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:35:49 crc kubenswrapper[4841]: I0130 06:35:49.432706 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:35:49 crc kubenswrapper[4841]: I0130 06:35:49.716943 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"d8439101c0c816d59529151ad7b9f22d723c26d50210e194b3b0d5dd43185d44"} Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.432804 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 06:35:53 crc kubenswrapper[4841]: E0130 
06:35:53.434170 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0155cc-fca5-4e85-b1ca-fd88ec51732b" containerName="mariadb-client" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.434207 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0155cc-fca5-4e85-b1ca-fd88ec51732b" containerName="mariadb-client" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.434665 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0155cc-fca5-4e85-b1ca-fd88ec51732b" containerName="mariadb-client" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.436738 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.443152 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.443740 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.444025 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.444245 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.444542 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xqfsd" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.451117 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.461979 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.464954 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.487568 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.498101 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.498219 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.500789 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582352 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lrqc\" (UniqueName: \"kubernetes.io/projected/dcc67785-4c94-4fda-a487-9a6d82288895-kube-api-access-8lrqc\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582392 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582431 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc67785-4c94-4fda-a487-9a6d82288895-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582655 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582789 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582816 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc67785-4c94-4fda-a487-9a6d82288895-config\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.582949 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.583090 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcc67785-4c94-4fda-a487-9a6d82288895-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.583172 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-config\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.583208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blmvp\" (UniqueName: \"kubernetes.io/projected/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-kube-api-access-blmvp\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.583944 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.584118 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.584152 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.584214 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690281 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcc67785-4c94-4fda-a487-9a6d82288895-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690356 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78zwn\" (UniqueName: \"kubernetes.io/projected/8975a1d6-f9a2-4161-9d45-3e2886345aec-kube-api-access-78zwn\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690423 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-config\") pod 
\"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blmvp\" (UniqueName: \"kubernetes.io/projected/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-kube-api-access-blmvp\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690515 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690750 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8975a1d6-f9a2-4161-9d45-3e2886345aec-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.690962 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8975a1d6-f9a2-4161-9d45-3e2886345aec-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " 
pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691046 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691310 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lrqc\" (UniqueName: \"kubernetes.io/projected/dcc67785-4c94-4fda-a487-9a6d82288895-kube-api-access-8lrqc\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691371 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691424 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc67785-4c94-4fda-a487-9a6d82288895-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691506 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8975a1d6-f9a2-4161-9d45-3e2886345aec-config\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691925 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcc67785-4c94-4fda-a487-9a6d82288895-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.691941 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692120 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-config\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692132 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-ovsdbserver-nb-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692202 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc67785-4c94-4fda-a487-9a6d82288895-config\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692549 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: 
\"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692571 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.692685 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.693453 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc67785-4c94-4fda-a487-9a6d82288895-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.693768 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc67785-4c94-4fda-a487-9a6d82288895-config\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.694682 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.695208 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.698420 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.698477 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7fbf12cbda74f529747edd8b86b6f19761d12468a561f8946490e26a32f56c10/globalmount\"" pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.699191 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.699233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.700377 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dcc67785-4c94-4fda-a487-9a6d82288895-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.702452 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.703355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.706051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.708611 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.708692 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a0bef350463295db1c2624cb58830dc3c2e550b4acf945b17ec7016bdbdd383/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.709737 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blmvp\" (UniqueName: \"kubernetes.io/projected/1317cff7-7bfb-4a1a-8686-d3bcb83d6949-kube-api-access-blmvp\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.710033 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lrqc\" (UniqueName: \"kubernetes.io/projected/dcc67785-4c94-4fda-a487-9a6d82288895-kube-api-access-8lrqc\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.729233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e5764c8-2e95-46a2-9798-d2dc3df49733\") pod \"ovsdbserver-nb-2\" (UID: \"dcc67785-4c94-4fda-a487-9a6d82288895\") " pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.769559 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf07ab6-c423-46aa-97d3-da5ea041522b\") pod \"ovsdbserver-nb-0\" (UID: \"1317cff7-7bfb-4a1a-8686-d3bcb83d6949\") " pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.777816 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.795734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8975a1d6-f9a2-4161-9d45-3e2886345aec-config\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.796067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.796107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.796131 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.796169 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.796226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78zwn\" (UniqueName: \"kubernetes.io/projected/8975a1d6-f9a2-4161-9d45-3e2886345aec-kube-api-access-78zwn\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.796267 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8975a1d6-f9a2-4161-9d45-3e2886345aec-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.796292 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8975a1d6-f9a2-4161-9d45-3e2886345aec-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.797761 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8975a1d6-f9a2-4161-9d45-3e2886345aec-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.797915 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8975a1d6-f9a2-4161-9d45-3e2886345aec-config\") pod \"ovsdbserver-nb-1\" (UID: 
\"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.798114 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8975a1d6-f9a2-4161-9d45-3e2886345aec-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.799346 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.799374 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5727dd24871f7c29bdc42f14b46d9ea997ee0c4a6490632f6780b0b2e5d2d9f0/globalmount\"" pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.801767 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.803248 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.809223 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8975a1d6-f9a2-4161-9d45-3e2886345aec-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.809548 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.817300 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78zwn\" (UniqueName: \"kubernetes.io/projected/8975a1d6-f9a2-4161-9d45-3e2886345aec-kube-api-access-78zwn\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:53 crc kubenswrapper[4841]: I0130 06:35:53.847937 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98baf24e-5f25-4dfb-a2fa-c8ac6cfb5890\") pod \"ovsdbserver-nb-1\" (UID: \"8975a1d6-f9a2-4161-9d45-3e2886345aec\") " pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.127654 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.333715 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 06:35:54 crc kubenswrapper[4841]: W0130 06:35:54.343492 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1317cff7_7bfb_4a1a_8686_d3bcb83d6949.slice/crio-73cd953385820f752d72d2352badf7ff65c0b8adfad1ef83b4deb548990250ca WatchSource:0}: Error finding container 73cd953385820f752d72d2352badf7ff65c0b8adfad1ef83b4deb548990250ca: Status 404 returned error can't find the container with id 73cd953385820f752d72d2352badf7ff65c0b8adfad1ef83b4deb548990250ca Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.428505 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.645908 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 30 06:35:54 crc kubenswrapper[4841]: W0130 06:35:54.658221 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8975a1d6_f9a2_4161_9d45_3e2886345aec.slice/crio-b60742e97c31ba841889af50cb72853716ecc578845f62afd9d565735b4d1f37 WatchSource:0}: Error finding container b60742e97c31ba841889af50cb72853716ecc578845f62afd9d565735b4d1f37: Status 404 returned error can't find the container with id b60742e97c31ba841889af50cb72853716ecc578845f62afd9d565735b4d1f37 Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.762450 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1317cff7-7bfb-4a1a-8686-d3bcb83d6949","Type":"ContainerStarted","Data":"d3e02977fd344b7352b9723e735c93103d062b9fc9964996f86acd7f69ebeedb"} Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.762508 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1317cff7-7bfb-4a1a-8686-d3bcb83d6949","Type":"ContainerStarted","Data":"73cd953385820f752d72d2352badf7ff65c0b8adfad1ef83b4deb548990250ca"} Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.764620 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"dcc67785-4c94-4fda-a487-9a6d82288895","Type":"ContainerStarted","Data":"c10319d55505c8537fc7aa272aa02badfca1f1c25ffc6a87c5b232c7bc927307"} Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.764670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"dcc67785-4c94-4fda-a487-9a6d82288895","Type":"ContainerStarted","Data":"5440b7767f7be87401ed7fc9b31234bdfc40b23f65928b28f9bcd3c815ef5ee2"} Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.775635 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"8975a1d6-f9a2-4161-9d45-3e2886345aec","Type":"ContainerStarted","Data":"b60742e97c31ba841889af50cb72853716ecc578845f62afd9d565735b4d1f37"} Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.929600 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.931024 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.935005 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-pcc4l" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.935262 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.936943 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.937214 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.951630 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.953025 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.972936 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.974535 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.974741 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.980440 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 30 06:35:54 crc kubenswrapper[4841]: I0130 06:35:54.986856 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.030791 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65kjz\" (UniqueName: \"kubernetes.io/projected/a274354c-c005-44ce-85f0-feb8762cc66d-kube-api-access-65kjz\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.031062 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a274354c-c005-44ce-85f0-feb8762cc66d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.031154 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a274354c-c005-44ce-85f0-feb8762cc66d-config\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.031312 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a274354c-c005-44ce-85f0-feb8762cc66d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.031463 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.031627 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.031732 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e3e635f7-f323-42df-90b7-1f0525945606\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3e635f7-f323-42df-90b7-1f0525945606\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.031826 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.135665 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/990cb14a-6eab-4cbf-997f-d04cb95b3575-config\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.136847 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llkmx\" (UniqueName: \"kubernetes.io/projected/990cb14a-6eab-4cbf-997f-d04cb95b3575-kube-api-access-llkmx\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.136985 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.137112 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990cb14a-6eab-4cbf-997f-d04cb95b3575-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.137221 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.137329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.137452 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/861c7989-9589-44f4-bfac-39fc9c3c5b8c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.137549 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.138605 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139144 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e3e635f7-f323-42df-90b7-1f0525945606\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3e635f7-f323-42df-90b7-1f0525945606\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139393 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-jqxzt\" (UniqueName: \"kubernetes.io/projected/861c7989-9589-44f4-bfac-39fc9c3c5b8c-kube-api-access-jqxzt\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/990cb14a-6eab-4cbf-997f-d04cb95b3575-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139647 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861c7989-9589-44f4-bfac-39fc9c3c5b8c-config\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139750 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65kjz\" (UniqueName: \"kubernetes.io/projected/a274354c-c005-44ce-85f0-feb8762cc66d-kube-api-access-65kjz\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139875 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a274354c-c005-44ce-85f0-feb8762cc66d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.139971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140066 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a274354c-c005-44ce-85f0-feb8762cc66d-config\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140172 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140479 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140519 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861c7989-9589-44f4-bfac-39fc9c3c5b8c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140546 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a274354c-c005-44ce-85f0-feb8762cc66d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.140821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a274354c-c005-44ce-85f0-feb8762cc66d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.142528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a274354c-c005-44ce-85f0-feb8762cc66d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.142612 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " 
pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.142728 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a274354c-c005-44ce-85f0-feb8762cc66d-config\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.142755 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.142882 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e3e635f7-f323-42df-90b7-1f0525945606\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3e635f7-f323-42df-90b7-1f0525945606\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bfefe9dcdd1f652fd7cbec25bdb4290d95c20d7bbf64efeb7e55ff79b9750d40/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.143127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.145820 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a274354c-c005-44ce-85f0-feb8762cc66d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.155985 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-65kjz\" (UniqueName: \"kubernetes.io/projected/a274354c-c005-44ce-85f0-feb8762cc66d-kube-api-access-65kjz\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.189189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e3e635f7-f323-42df-90b7-1f0525945606\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3e635f7-f323-42df-90b7-1f0525945606\") pod \"ovsdbserver-sb-0\" (UID: \"a274354c-c005-44ce-85f0-feb8762cc66d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242332 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxzt\" (UniqueName: \"kubernetes.io/projected/861c7989-9589-44f4-bfac-39fc9c3c5b8c-kube-api-access-jqxzt\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242395 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/990cb14a-6eab-4cbf-997f-d04cb95b3575-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242438 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861c7989-9589-44f4-bfac-39fc9c3c5b8c-config\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242487 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242620 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861c7989-9589-44f4-bfac-39fc9c3c5b8c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242650 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242692 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: 
\"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242741 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/990cb14a-6eab-4cbf-997f-d04cb95b3575-config\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.242765 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llkmx\" (UniqueName: \"kubernetes.io/projected/990cb14a-6eab-4cbf-997f-d04cb95b3575-kube-api-access-llkmx\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.243296 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/990cb14a-6eab-4cbf-997f-d04cb95b3575-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.243386 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861c7989-9589-44f4-bfac-39fc9c3c5b8c-config\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.243624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/861c7989-9589-44f4-bfac-39fc9c3c5b8c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.244552 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/990cb14a-6eab-4cbf-997f-d04cb95b3575-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.244612 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.244671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.244715 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.244738 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/861c7989-9589-44f4-bfac-39fc9c3c5b8c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1" Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.245121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/861c7989-9589-44f4-bfac-39fc9c3c5b8c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: 
\"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.246942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990cb14a-6eab-4cbf-997f-d04cb95b3575-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.247861 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.247893 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ba017690b03f2183efa9e2f0b949ccdb1d739a40355891c993b12a9d2d0a1530/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.248039 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/990cb14a-6eab-4cbf-997f-d04cb95b3575-config\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.248362 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.248453 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d2158934bbf64df54a1b8e20a3be1d184a09d333d896c98586c2d1f5908f26fa/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.249196 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.252030 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.253069 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.254216 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.255093 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861c7989-9589-44f4-bfac-39fc9c3c5b8c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.255380 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/990cb14a-6eab-4cbf-997f-d04cb95b3575-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.258849 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxzt\" (UniqueName: \"kubernetes.io/projected/861c7989-9589-44f4-bfac-39fc9c3c5b8c-kube-api-access-jqxzt\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.262162 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llkmx\" (UniqueName: \"kubernetes.io/projected/990cb14a-6eab-4cbf-997f-d04cb95b3575-kube-api-access-llkmx\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.279264 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abac6fb5-957a-495c-aaec-e2cfbce0e3e5\") pod \"ovsdbserver-sb-1\" (UID: \"861c7989-9589-44f4-bfac-39fc9c3c5b8c\") " pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.293584 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62eeacbe-fd42-42e8-ac2f-4b039968caa9\") pod \"ovsdbserver-sb-2\" (UID: \"990cb14a-6eab-4cbf-997f-d04cb95b3575\") " pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.356587 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.366601 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.384718 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.798873 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1317cff7-7bfb-4a1a-8686-d3bcb83d6949","Type":"ContainerStarted","Data":"ac6e6038b9a7eda5a2f9768cd713f8fb248db8cb5fb88049992b075b1cc62c61"}
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.801085 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"dcc67785-4c94-4fda-a487-9a6d82288895","Type":"ContainerStarted","Data":"12d5d45b26a012069ee7a0ceee93ba7ca8fe45eb56f57e6bbce969582dd3f6ae"}
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.804926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"8975a1d6-f9a2-4161-9d45-3e2886345aec","Type":"ContainerStarted","Data":"698bdee0190a80abf71b8e74d27ebf7afcf87d81007da8768b51c8706fee439f"}
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.804969 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"8975a1d6-f9a2-4161-9d45-3e2886345aec","Type":"ContainerStarted","Data":"b01bbad30df236c678bb8a5f41fcecb351c967e32795f2605c803822f079ee96"}
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.843844 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.843784308 podStartE2EDuration="3.843784308s" podCreationTimestamp="2026-01-30 06:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:55.836283299 +0000 UTC m=+5292.829755937" watchObservedRunningTime="2026-01-30 06:35:55.843784308 +0000 UTC m=+5292.837256986"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.857484 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.864655 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.864630203 podStartE2EDuration="3.864630203s" podCreationTimestamp="2026-01-30 06:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:55.856454005 +0000 UTC m=+5292.849926663" watchObservedRunningTime="2026-01-30 06:35:55.864630203 +0000 UTC m=+5292.858102851"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.882108 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.882087518 podStartE2EDuration="3.882087518s" podCreationTimestamp="2026-01-30 06:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:55.872896893 +0000 UTC m=+5292.866369531" watchObservedRunningTime="2026-01-30 06:35:55.882087518 +0000 UTC m=+5292.875560156"
Jan 30 06:35:55 crc kubenswrapper[4841]: I0130 06:35:55.955544 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Jan 30 06:35:55 crc kubenswrapper[4841]: W0130 06:35:55.955644 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod861c7989_9589_44f4_bfac_39fc9c3c5b8c.slice/crio-acc38b57250df251675dfc5f4278a4e31d32a7e3af98959c223e5c25fd9b0ad9 WatchSource:0}: Error finding container acc38b57250df251675dfc5f4278a4e31d32a7e3af98959c223e5c25fd9b0ad9: Status 404 returned error can't find the container with id acc38b57250df251675dfc5f4278a4e31d32a7e3af98959c223e5c25fd9b0ad9
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.674904 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 30 06:35:56 crc kubenswrapper[4841]: W0130 06:35:56.677107 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda274354c_c005_44ce_85f0_feb8762cc66d.slice/crio-45c1b756285546fa4b73f2a1f9463c505822532dd0fb6b2507b5fa3fa61e253c WatchSource:0}: Error finding container 45c1b756285546fa4b73f2a1f9463c505822532dd0fb6b2507b5fa3fa61e253c: Status 404 returned error can't find the container with id 45c1b756285546fa4b73f2a1f9463c505822532dd0fb6b2507b5fa3fa61e253c
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.778059 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.813573 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"861c7989-9589-44f4-bfac-39fc9c3c5b8c","Type":"ContainerStarted","Data":"8eadd85fb75154819f568cf4346416438de4431a3165879c6c445bf276a16720"}
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.813618 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.813629 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"861c7989-9589-44f4-bfac-39fc9c3c5b8c","Type":"ContainerStarted","Data":"a600d0aec4ec9b8222529d59a95ecbbdbca14a22981d7a625161086169b359c0"}
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.813638 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"861c7989-9589-44f4-bfac-39fc9c3c5b8c","Type":"ContainerStarted","Data":"acc38b57250df251675dfc5f4278a4e31d32a7e3af98959c223e5c25fd9b0ad9"}
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.815104 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a274354c-c005-44ce-85f0-feb8762cc66d","Type":"ContainerStarted","Data":"45c1b756285546fa4b73f2a1f9463c505822532dd0fb6b2507b5fa3fa61e253c"}
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.818558 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"990cb14a-6eab-4cbf-997f-d04cb95b3575","Type":"ContainerStarted","Data":"0d5a42a1eb2159a9bcd22ba6c46b0cca93e50e4e8b5e9d064c98504eeb3b26ff"}
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.818583 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"990cb14a-6eab-4cbf-997f-d04cb95b3575","Type":"ContainerStarted","Data":"1841831aba5b0931b5afbcf06c7908e36719b57bb7ef79a00ded67b93efbc276"}
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.818593 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"990cb14a-6eab-4cbf-997f-d04cb95b3575","Type":"ContainerStarted","Data":"36448b89fe2a7ed50ea102cb82c913aa896860b2c4406c0a4659a2ef2d5922dd"}
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.842030 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.84201019 podStartE2EDuration="3.84201019s" podCreationTimestamp="2026-01-30 06:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:56.833640438 +0000 UTC m=+5293.827113076" watchObservedRunningTime="2026-01-30 06:35:56.84201019 +0000 UTC m=+5293.835482828"
Jan 30 06:35:56 crc kubenswrapper[4841]: I0130 06:35:56.861500 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.861464328 podStartE2EDuration="3.861464328s" podCreationTimestamp="2026-01-30 06:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:56.85326246 +0000 UTC m=+5293.846735098" watchObservedRunningTime="2026-01-30 06:35:56.861464328 +0000 UTC m=+5293.854936966"
Jan 30 06:35:57 crc kubenswrapper[4841]: I0130 06:35:57.128489 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Jan 30 06:35:57 crc kubenswrapper[4841]: I0130 06:35:57.170713 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Jan 30 06:35:57 crc kubenswrapper[4841]: I0130 06:35:57.837769 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a274354c-c005-44ce-85f0-feb8762cc66d","Type":"ContainerStarted","Data":"b0fd1bafe0fd4fe32d4f40437e74733d2ff106b94e43e7e27305dd447a59d5ff"}
Jan 30 06:35:57 crc kubenswrapper[4841]: I0130 06:35:57.837889 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a274354c-c005-44ce-85f0-feb8762cc66d","Type":"ContainerStarted","Data":"b9e5e38838db9580192802c41698a474581b817ad9322cd4f2a032f40a72c3e7"}
Jan 30 06:35:57 crc kubenswrapper[4841]: I0130 06:35:57.839472 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Jan 30 06:35:57 crc kubenswrapper[4841]: I0130 06:35:57.879648 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.879617811 podStartE2EDuration="4.879617811s" podCreationTimestamp="2026-01-30 06:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:35:57.867712285 +0000 UTC m=+5294.861184983" watchObservedRunningTime="2026-01-30 06:35:57.879617811 +0000 UTC m=+5294.873090479"
Jan 30 06:35:58 crc kubenswrapper[4841]: I0130 06:35:58.356911 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Jan 30 06:35:58 crc kubenswrapper[4841]: I0130 06:35:58.367330 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Jan 30 06:35:58 crc kubenswrapper[4841]: I0130 06:35:58.386497 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Jan 30 06:35:58 crc kubenswrapper[4841]: I0130 06:35:58.778697 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Jan 30 06:35:58 crc kubenswrapper[4841]: I0130 06:35:58.810342 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.214984 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.542713 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc945dddc-j6qhs"]
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.544331 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.547086 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.557416 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc945dddc-j6qhs"]
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.646344 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-dns-svc\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.646480 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-config\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.646510 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.646552 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99s6n\" (UniqueName: \"kubernetes.io/projected/1d283730-a2d7-431b-a09f-a74630d1f766-kube-api-access-99s6n\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.748184 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-dns-svc\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.748316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-config\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.748369 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.748446 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99s6n\" (UniqueName: \"kubernetes.io/projected/1d283730-a2d7-431b-a09f-a74630d1f766-kube-api-access-99s6n\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.749328 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-dns-svc\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.749472 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-config\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.749946 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.771281 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99s6n\" (UniqueName: \"kubernetes.io/projected/1d283730-a2d7-431b-a09f-a74630d1f766-kube-api-access-99s6n\") pod \"dnsmasq-dns-6dc945dddc-j6qhs\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") " pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.831907 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.864342 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.912906 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.915601 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:35:59 crc kubenswrapper[4841]: I0130 06:35:59.918328 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Jan 30 06:36:00 crc kubenswrapper[4841]: I0130 06:36:00.356667 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 30 06:36:00 crc kubenswrapper[4841]: I0130 06:36:00.366902 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Jan 30 06:36:00 crc kubenswrapper[4841]: I0130 06:36:00.385481 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Jan 30 06:36:00 crc kubenswrapper[4841]: I0130 06:36:00.399260 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc945dddc-j6qhs"]
Jan 30 06:36:00 crc kubenswrapper[4841]: W0130 06:36:00.411572 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d283730_a2d7_431b_a09f_a74630d1f766.slice/crio-330a7cf28372d50d31d4193e10aae0e28a3d622180b8bf598badfc268228717b WatchSource:0}: Error finding container 330a7cf28372d50d31d4193e10aae0e28a3d622180b8bf598badfc268228717b: Status 404 returned error can't find the container with id 330a7cf28372d50d31d4193e10aae0e28a3d622180b8bf598badfc268228717b
Jan 30 06:36:00 crc kubenswrapper[4841]: I0130 06:36:00.879661 4841 generic.go:334] "Generic (PLEG): container finished" podID="1d283730-a2d7-431b-a09f-a74630d1f766" containerID="417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c" exitCode=0
Jan 30 06:36:00 crc kubenswrapper[4841]: I0130 06:36:00.879726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" event={"ID":"1d283730-a2d7-431b-a09f-a74630d1f766","Type":"ContainerDied","Data":"417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c"}
Jan 30 06:36:00 crc kubenswrapper[4841]: I0130 06:36:00.880237 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" event={"ID":"1d283730-a2d7-431b-a09f-a74630d1f766","Type":"ContainerStarted","Data":"330a7cf28372d50d31d4193e10aae0e28a3d622180b8bf598badfc268228717b"}
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.420394 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.429263 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.444300 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.498225 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.501467 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.684378 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc945dddc-j6qhs"]
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.713829 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69886d74bc-jh4dr"]
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.715549 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.722475 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.741235 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69886d74bc-jh4dr"]
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.800478 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-config\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.800793 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-sb\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.800923 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/18d4de94-fb76-4b94-a887-a3e4c20c8761-kube-api-access-l5lj7\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.800967 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-dns-svc\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.801062 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-nb\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.895565 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" event={"ID":"1d283730-a2d7-431b-a09f-a74630d1f766","Type":"ContainerStarted","Data":"fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1"}
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.902979 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-config\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.903140 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-sb\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.903207 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/18d4de94-fb76-4b94-a887-a3e4c20c8761-kube-api-access-l5lj7\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.903242 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-dns-svc\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.903284 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-nb\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.904444 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-sb\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.904764 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-config\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.904826 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-nb\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.905090 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-dns-svc\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.935066 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" podStartSLOduration=2.935038036 podStartE2EDuration="2.935038036s" podCreationTimestamp="2026-01-30 06:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:01.920196081 +0000 UTC m=+5298.913668759" watchObservedRunningTime="2026-01-30 06:36:01.935038036 +0000 UTC m=+5298.928510704"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.943122 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/18d4de94-fb76-4b94-a887-a3e4c20c8761-kube-api-access-l5lj7\") pod \"dnsmasq-dns-69886d74bc-jh4dr\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:01 crc kubenswrapper[4841]: I0130 06:36:01.969888 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Jan 30 06:36:02 crc kubenswrapper[4841]: I0130 06:36:02.073721 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr"
Jan 30 06:36:02 crc kubenswrapper[4841]: I0130 06:36:02.565846 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69886d74bc-jh4dr"]
Jan 30 06:36:02 crc kubenswrapper[4841]: W0130 06:36:02.574574 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d4de94_fb76_4b94_a887_a3e4c20c8761.slice/crio-fe831850ec8ff25352a2b1e7e26fe2968a23b9286bd7d3d9f6f4e664bd0393fe WatchSource:0}: Error finding container fe831850ec8ff25352a2b1e7e26fe2968a23b9286bd7d3d9f6f4e664bd0393fe: Status 404 returned error can't find the container with id fe831850ec8ff25352a2b1e7e26fe2968a23b9286bd7d3d9f6f4e664bd0393fe
Jan 30 06:36:02 crc kubenswrapper[4841]: I0130 06:36:02.906482 4841 generic.go:334] "Generic (PLEG): container finished" podID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerID="6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447" exitCode=0
Jan 30 06:36:02 crc kubenswrapper[4841]: I0130 06:36:02.906583 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" event={"ID":"18d4de94-fb76-4b94-a887-a3e4c20c8761","Type":"ContainerDied","Data":"6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447"}
Jan 30 06:36:02 crc kubenswrapper[4841]: I0130 06:36:02.906654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" event={"ID":"18d4de94-fb76-4b94-a887-a3e4c20c8761","Type":"ContainerStarted","Data":"fe831850ec8ff25352a2b1e7e26fe2968a23b9286bd7d3d9f6f4e664bd0393fe"}
Jan 30 06:36:02 crc kubenswrapper[4841]: I0130 06:36:02.907140 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:36:02 crc kubenswrapper[4841]: I0130 06:36:02.907470 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" podUID="1d283730-a2d7-431b-a09f-a74630d1f766" containerName="dnsmasq-dns" containerID="cri-o://fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1" gracePeriod=10
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.678079 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs"
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.839067 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-config\") pod \"1d283730-a2d7-431b-a09f-a74630d1f766\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") "
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.839173 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99s6n\" (UniqueName: \"kubernetes.io/projected/1d283730-a2d7-431b-a09f-a74630d1f766-kube-api-access-99s6n\") pod \"1d283730-a2d7-431b-a09f-a74630d1f766\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") "
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.839265 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-ovsdbserver-nb\") pod \"1d283730-a2d7-431b-a09f-a74630d1f766\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") "
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.839294 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-dns-svc\") pod \"1d283730-a2d7-431b-a09f-a74630d1f766\" (UID: \"1d283730-a2d7-431b-a09f-a74630d1f766\") "
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.845007 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d283730-a2d7-431b-a09f-a74630d1f766-kube-api-access-99s6n" (OuterVolumeSpecName: "kube-api-access-99s6n") pod "1d283730-a2d7-431b-a09f-a74630d1f766" (UID: "1d283730-a2d7-431b-a09f-a74630d1f766"). InnerVolumeSpecName "kube-api-access-99s6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.889500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d283730-a2d7-431b-a09f-a74630d1f766" (UID: "1d283730-a2d7-431b-a09f-a74630d1f766"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.889599 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d283730-a2d7-431b-a09f-a74630d1f766" (UID: "1d283730-a2d7-431b-a09f-a74630d1f766"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.889932 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-config" (OuterVolumeSpecName: "config") pod "1d283730-a2d7-431b-a09f-a74630d1f766" (UID: "1d283730-a2d7-431b-a09f-a74630d1f766"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.922384 4841 generic.go:334] "Generic (PLEG): container finished" podID="1d283730-a2d7-431b-a09f-a74630d1f766" containerID="fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1" exitCode=0
Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.922492 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.922513 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" event={"ID":"1d283730-a2d7-431b-a09f-a74630d1f766","Type":"ContainerDied","Data":"fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1"} Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.922554 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc945dddc-j6qhs" event={"ID":"1d283730-a2d7-431b-a09f-a74630d1f766","Type":"ContainerDied","Data":"330a7cf28372d50d31d4193e10aae0e28a3d622180b8bf598badfc268228717b"} Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.922580 4841 scope.go:117] "RemoveContainer" containerID="fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.926166 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" event={"ID":"18d4de94-fb76-4b94-a887-a3e4c20c8761","Type":"ContainerStarted","Data":"ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455"} Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.927122 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.941383 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.941463 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.941481 4841 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d283730-a2d7-431b-a09f-a74630d1f766-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.941496 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99s6n\" (UniqueName: \"kubernetes.io/projected/1d283730-a2d7-431b-a09f-a74630d1f766-kube-api-access-99s6n\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.965817 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" podStartSLOduration=2.965780994 podStartE2EDuration="2.965780994s" podCreationTimestamp="2026-01-30 06:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:03.951231587 +0000 UTC m=+5300.944704235" watchObservedRunningTime="2026-01-30 06:36:03.965780994 +0000 UTC m=+5300.959253682" Jan 30 06:36:03 crc kubenswrapper[4841]: I0130 06:36:03.989463 4841 scope.go:117] "RemoveContainer" containerID="417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.015450 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc945dddc-j6qhs"] Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.033707 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc945dddc-j6qhs"] Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.034877 4841 scope.go:117] "RemoveContainer" containerID="fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1" Jan 30 06:36:04 crc kubenswrapper[4841]: E0130 06:36:04.035628 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1\": container with ID starting with 
fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1 not found: ID does not exist" containerID="fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.035991 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1"} err="failed to get container status \"fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1\": rpc error: code = NotFound desc = could not find container \"fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1\": container with ID starting with fe1fdfdac56a0b5fffc74e21e53e8b19d81572052185d0a8661d29b87411dbb1 not found: ID does not exist" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.036032 4841 scope.go:117] "RemoveContainer" containerID="417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c" Jan 30 06:36:04 crc kubenswrapper[4841]: E0130 06:36:04.036482 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c\": container with ID starting with 417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c not found: ID does not exist" containerID="417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.036569 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c"} err="failed to get container status \"417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c\": rpc error: code = NotFound desc = could not find container \"417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c\": container with ID starting with 417ed40bae65c881835279cb1135719ef6fc76a58202018f27f340d92252269c not found: ID does not 
exist" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.447004 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d283730-a2d7-431b-a09f-a74630d1f766" path="/var/lib/kubelet/pods/1d283730-a2d7-431b-a09f-a74630d1f766/volumes" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.559675 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Jan 30 06:36:04 crc kubenswrapper[4841]: E0130 06:36:04.560183 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d283730-a2d7-431b-a09f-a74630d1f766" containerName="dnsmasq-dns" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.560206 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d283730-a2d7-431b-a09f-a74630d1f766" containerName="dnsmasq-dns" Jan 30 06:36:04 crc kubenswrapper[4841]: E0130 06:36:04.560228 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d283730-a2d7-431b-a09f-a74630d1f766" containerName="init" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.560238 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d283730-a2d7-431b-a09f-a74630d1f766" containerName="init" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.560569 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d283730-a2d7-431b-a09f-a74630d1f766" containerName="dnsmasq-dns" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.561355 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.564139 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.573332 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.656090 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.656205 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5fe595aa-ebed-48fc-a546-37e04be7808f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.656277 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvhh\" (UniqueName: \"kubernetes.io/projected/5fe595aa-ebed-48fc-a546-37e04be7808f-kube-api-access-mhvhh\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.757907 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvhh\" (UniqueName: \"kubernetes.io/projected/5fe595aa-ebed-48fc-a546-37e04be7808f-kube-api-access-mhvhh\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.759301 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.759687 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5fe595aa-ebed-48fc-a546-37e04be7808f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.761691 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.761748 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a1e12a11a629bdcf64026417b2e12d87b16edcccac6c44be371294f30c73374/globalmount\"" pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.771810 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5fe595aa-ebed-48fc-a546-37e04be7808f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.798709 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvhh\" (UniqueName: 
\"kubernetes.io/projected/5fe595aa-ebed-48fc-a546-37e04be7808f-kube-api-access-mhvhh\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.816163 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73f48c4-786b-403c-b9b6-906d749b3f1b\") pod \"ovn-copy-data\" (UID: \"5fe595aa-ebed-48fc-a546-37e04be7808f\") " pod="openstack/ovn-copy-data" Jan 30 06:36:04 crc kubenswrapper[4841]: I0130 06:36:04.887790 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Jan 30 06:36:05 crc kubenswrapper[4841]: I0130 06:36:05.453575 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Jan 30 06:36:05 crc kubenswrapper[4841]: W0130 06:36:05.457215 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fe595aa_ebed_48fc_a546_37e04be7808f.slice/crio-a285b386b8387ca2455260908b2b822c944f7409c5096328038d611432516a35 WatchSource:0}: Error finding container a285b386b8387ca2455260908b2b822c944f7409c5096328038d611432516a35: Status 404 returned error can't find the container with id a285b386b8387ca2455260908b2b822c944f7409c5096328038d611432516a35 Jan 30 06:36:05 crc kubenswrapper[4841]: I0130 06:36:05.460357 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:36:05 crc kubenswrapper[4841]: I0130 06:36:05.957678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5fe595aa-ebed-48fc-a546-37e04be7808f","Type":"ContainerStarted","Data":"a285b386b8387ca2455260908b2b822c944f7409c5096328038d611432516a35"} Jan 30 06:36:06 crc kubenswrapper[4841]: I0130 06:36:06.970709 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5fe595aa-ebed-48fc-a546-37e04be7808f","Type":"ContainerStarted","Data":"6e21465a7a9162fab943ec871c3d06e7c600551ea70ab4a2834c3edb8c9e307d"} Jan 30 06:36:07 crc kubenswrapper[4841]: I0130 06:36:07.000650 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.525467443 podStartE2EDuration="4.000627142s" podCreationTimestamp="2026-01-30 06:36:03 +0000 UTC" firstStartedPulling="2026-01-30 06:36:05.460035161 +0000 UTC m=+5302.453507819" lastFinishedPulling="2026-01-30 06:36:05.93519484 +0000 UTC m=+5302.928667518" observedRunningTime="2026-01-30 06:36:06.998610178 +0000 UTC m=+5303.992082916" watchObservedRunningTime="2026-01-30 06:36:07.000627142 +0000 UTC m=+5303.994099790" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.075616 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.169291 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-nrhwp"] Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.169838 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" podUID="3e71987d-6237-432a-9a8d-d360b62b497d" containerName="dnsmasq-dns" containerID="cri-o://602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419" gracePeriod=10 Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.657824 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.708184 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-config\") pod \"3e71987d-6237-432a-9a8d-d360b62b497d\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.708285 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d45l\" (UniqueName: \"kubernetes.io/projected/3e71987d-6237-432a-9a8d-d360b62b497d-kube-api-access-6d45l\") pod \"3e71987d-6237-432a-9a8d-d360b62b497d\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.708449 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-dns-svc\") pod \"3e71987d-6237-432a-9a8d-d360b62b497d\" (UID: \"3e71987d-6237-432a-9a8d-d360b62b497d\") " Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.714067 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e71987d-6237-432a-9a8d-d360b62b497d-kube-api-access-6d45l" (OuterVolumeSpecName: "kube-api-access-6d45l") pod "3e71987d-6237-432a-9a8d-d360b62b497d" (UID: "3e71987d-6237-432a-9a8d-d360b62b497d"). InnerVolumeSpecName "kube-api-access-6d45l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.759741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e71987d-6237-432a-9a8d-d360b62b497d" (UID: "3e71987d-6237-432a-9a8d-d360b62b497d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.768036 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-config" (OuterVolumeSpecName: "config") pod "3e71987d-6237-432a-9a8d-d360b62b497d" (UID: "3e71987d-6237-432a-9a8d-d360b62b497d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.804218 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 30 06:36:12 crc kubenswrapper[4841]: E0130 06:36:12.804597 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e71987d-6237-432a-9a8d-d360b62b497d" containerName="dnsmasq-dns" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.804619 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e71987d-6237-432a-9a8d-d360b62b497d" containerName="dnsmasq-dns" Jan 30 06:36:12 crc kubenswrapper[4841]: E0130 06:36:12.804642 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e71987d-6237-432a-9a8d-d360b62b497d" containerName="init" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.804650 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e71987d-6237-432a-9a8d-d360b62b497d" containerName="init" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.804844 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e71987d-6237-432a-9a8d-d360b62b497d" containerName="dnsmasq-dns" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.807246 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.811502 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.811533 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e71987d-6237-432a-9a8d-d360b62b497d-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.811543 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d45l\" (UniqueName: \"kubernetes.io/projected/3e71987d-6237-432a-9a8d-d360b62b497d-kube-api-access-6d45l\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.816074 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.816074 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.816390 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-bt4gl" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.816530 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.835952 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.913390 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.913457 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8qb9\" (UniqueName: \"kubernetes.io/projected/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-kube-api-access-x8qb9\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.913633 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.913824 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.913913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-scripts\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:12 crc kubenswrapper[4841]: I0130 06:36:12.914074 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:12 
crc kubenswrapper[4841]: I0130 06:36:12.914213 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-config\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.016596 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.016663 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8qb9\" (UniqueName: \"kubernetes.io/projected/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-kube-api-access-x8qb9\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.016746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.016845 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.016902 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-scripts\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.016974 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.017033 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-config\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.017162 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.018540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-scripts\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.018927 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-config\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.020703 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.021135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.022825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.032891 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8qb9\" (UniqueName: \"kubernetes.io/projected/cea99eaf-d2ae-4baf-a45c-27a7f5279d5c-kube-api-access-x8qb9\") pod \"ovn-northd-0\" (UID: \"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c\") " pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.035162 4841 generic.go:334] "Generic (PLEG): container finished" podID="3e71987d-6237-432a-9a8d-d360b62b497d" containerID="602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419" exitCode=0 Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.035202 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" event={"ID":"3e71987d-6237-432a-9a8d-d360b62b497d","Type":"ContainerDied","Data":"602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419"} Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.035220 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.035238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-nrhwp" event={"ID":"3e71987d-6237-432a-9a8d-d360b62b497d","Type":"ContainerDied","Data":"eb48a1f42d0ca3dee9456f06d6e8bc5154ee0e42da70b38d5c892e6a37e51327"} Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.035258 4841 scope.go:117] "RemoveContainer" containerID="602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.062452 4841 scope.go:117] "RemoveContainer" containerID="34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.084296 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-nrhwp"] Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.097375 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-nrhwp"] Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.107259 4841 scope.go:117] "RemoveContainer" containerID="602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419" Jan 30 06:36:13 crc kubenswrapper[4841]: E0130 06:36:13.108165 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419\": container with ID starting with 602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419 not found: ID does not exist" containerID="602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.108217 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419"} err="failed to get container 
status \"602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419\": rpc error: code = NotFound desc = could not find container \"602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419\": container with ID starting with 602c293ce9b8b0e83fa95449a182d56291638c71f4146e4faccedf81bd331419 not found: ID does not exist" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.108247 4841 scope.go:117] "RemoveContainer" containerID="34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce" Jan 30 06:36:13 crc kubenswrapper[4841]: E0130 06:36:13.108508 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce\": container with ID starting with 34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce not found: ID does not exist" containerID="34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.108529 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce"} err="failed to get container status \"34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce\": rpc error: code = NotFound desc = could not find container \"34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce\": container with ID starting with 34b5a83249a194f4fa49ef98e101d2ef5d94ee834c156e1c69652642c04065ce not found: ID does not exist" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.135765 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 06:36:13 crc kubenswrapper[4841]: I0130 06:36:13.632425 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 06:36:14 crc kubenswrapper[4841]: I0130 06:36:14.050748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c","Type":"ContainerStarted","Data":"e110bb92cc2fa08791c1410e13fabec1f19495d3c4193d829bf52b249dfb5a8e"} Jan 30 06:36:14 crc kubenswrapper[4841]: I0130 06:36:14.050837 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c","Type":"ContainerStarted","Data":"6b37d0562555fdfcdbed204e17a3294b437f19d7c899d794993065916ade98ca"} Jan 30 06:36:14 crc kubenswrapper[4841]: I0130 06:36:14.450654 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e71987d-6237-432a-9a8d-d360b62b497d" path="/var/lib/kubelet/pods/3e71987d-6237-432a-9a8d-d360b62b497d/volumes" Jan 30 06:36:15 crc kubenswrapper[4841]: I0130 06:36:15.064683 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cea99eaf-d2ae-4baf-a45c-27a7f5279d5c","Type":"ContainerStarted","Data":"69ba2d60de00e76e9d9482195d8c95d12907f9ff8a5eab3e6e0a5ab350749157"} Jan 30 06:36:15 crc kubenswrapper[4841]: I0130 06:36:15.064878 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 30 06:36:15 crc kubenswrapper[4841]: I0130 06:36:15.092761 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.092674781 podStartE2EDuration="3.092674781s" podCreationTimestamp="2026-01-30 06:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:15.091971202 +0000 UTC m=+5312.085443860" 
watchObservedRunningTime="2026-01-30 06:36:15.092674781 +0000 UTC m=+5312.086147449" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.170986 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zzzcl"] Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.172219 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.180289 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zzzcl"] Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.214226 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzs2d\" (UniqueName: \"kubernetes.io/projected/deed29e2-f797-4d6e-a53c-8a6a70f98888-kube-api-access-wzs2d\") pod \"keystone-db-create-zzzcl\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.214489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deed29e2-f797-4d6e-a53c-8a6a70f98888-operator-scripts\") pod \"keystone-db-create-zzzcl\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.267643 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-caf3-account-create-update-rsk67"] Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.269118 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.271199 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.281146 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-caf3-account-create-update-rsk67"] Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.319485 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deed29e2-f797-4d6e-a53c-8a6a70f98888-operator-scripts\") pod \"keystone-db-create-zzzcl\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.319611 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzs2d\" (UniqueName: \"kubernetes.io/projected/deed29e2-f797-4d6e-a53c-8a6a70f98888-kube-api-access-wzs2d\") pod \"keystone-db-create-zzzcl\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.319658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlvm\" (UniqueName: \"kubernetes.io/projected/b447a96e-6def-4771-b1f9-ed2f17a2fd37-kube-api-access-crlvm\") pod \"keystone-caf3-account-create-update-rsk67\" (UID: \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.319705 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b447a96e-6def-4771-b1f9-ed2f17a2fd37-operator-scripts\") pod \"keystone-caf3-account-create-update-rsk67\" (UID: 
\"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.320574 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deed29e2-f797-4d6e-a53c-8a6a70f98888-operator-scripts\") pod \"keystone-db-create-zzzcl\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.341058 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzs2d\" (UniqueName: \"kubernetes.io/projected/deed29e2-f797-4d6e-a53c-8a6a70f98888-kube-api-access-wzs2d\") pod \"keystone-db-create-zzzcl\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.421686 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlvm\" (UniqueName: \"kubernetes.io/projected/b447a96e-6def-4771-b1f9-ed2f17a2fd37-kube-api-access-crlvm\") pod \"keystone-caf3-account-create-update-rsk67\" (UID: \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.421767 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b447a96e-6def-4771-b1f9-ed2f17a2fd37-operator-scripts\") pod \"keystone-caf3-account-create-update-rsk67\" (UID: \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.422577 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b447a96e-6def-4771-b1f9-ed2f17a2fd37-operator-scripts\") pod 
\"keystone-caf3-account-create-update-rsk67\" (UID: \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.442982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlvm\" (UniqueName: \"kubernetes.io/projected/b447a96e-6def-4771-b1f9-ed2f17a2fd37-kube-api-access-crlvm\") pod \"keystone-caf3-account-create-update-rsk67\" (UID: \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.489749 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.623218 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:18 crc kubenswrapper[4841]: W0130 06:36:18.969536 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeed29e2_f797_4d6e_a53c_8a6a70f98888.slice/crio-eb1329567618e047de0f1d20c925bfe8766e4055b1ecdac2cfd0a32dfe160737 WatchSource:0}: Error finding container eb1329567618e047de0f1d20c925bfe8766e4055b1ecdac2cfd0a32dfe160737: Status 404 returned error can't find the container with id eb1329567618e047de0f1d20c925bfe8766e4055b1ecdac2cfd0a32dfe160737 Jan 30 06:36:18 crc kubenswrapper[4841]: I0130 06:36:18.970367 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zzzcl"] Jan 30 06:36:19 crc kubenswrapper[4841]: I0130 06:36:19.070797 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-caf3-account-create-update-rsk67"] Jan 30 06:36:19 crc kubenswrapper[4841]: I0130 06:36:19.107934 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zzzcl" 
event={"ID":"deed29e2-f797-4d6e-a53c-8a6a70f98888","Type":"ContainerStarted","Data":"eb1329567618e047de0f1d20c925bfe8766e4055b1ecdac2cfd0a32dfe160737"} Jan 30 06:36:19 crc kubenswrapper[4841]: I0130 06:36:19.110138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-caf3-account-create-update-rsk67" event={"ID":"b447a96e-6def-4771-b1f9-ed2f17a2fd37","Type":"ContainerStarted","Data":"54d1f2bfad2b23d20ed050074e978c1968a31e5ab579e74f3526ff72f88abfe1"} Jan 30 06:36:20 crc kubenswrapper[4841]: I0130 06:36:20.125458 4841 generic.go:334] "Generic (PLEG): container finished" podID="deed29e2-f797-4d6e-a53c-8a6a70f98888" containerID="2919ee10621b59e484c5a46b7fc4eff039c6cadb1fa1165aac92b9a68ce08d69" exitCode=0 Jan 30 06:36:20 crc kubenswrapper[4841]: I0130 06:36:20.125616 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zzzcl" event={"ID":"deed29e2-f797-4d6e-a53c-8a6a70f98888","Type":"ContainerDied","Data":"2919ee10621b59e484c5a46b7fc4eff039c6cadb1fa1165aac92b9a68ce08d69"} Jan 30 06:36:20 crc kubenswrapper[4841]: I0130 06:36:20.128899 4841 generic.go:334] "Generic (PLEG): container finished" podID="b447a96e-6def-4771-b1f9-ed2f17a2fd37" containerID="866cd25e2c9d737cc7210a35349328ee12b39451f6c10b9048c7377c362c082a" exitCode=0 Jan 30 06:36:20 crc kubenswrapper[4841]: I0130 06:36:20.128944 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-caf3-account-create-update-rsk67" event={"ID":"b447a96e-6def-4771-b1f9-ed2f17a2fd37","Type":"ContainerDied","Data":"866cd25e2c9d737cc7210a35349328ee12b39451f6c10b9048c7377c362c082a"} Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.570875 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.579188 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.693678 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b447a96e-6def-4771-b1f9-ed2f17a2fd37-operator-scripts\") pod \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\" (UID: \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.693782 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deed29e2-f797-4d6e-a53c-8a6a70f98888-operator-scripts\") pod \"deed29e2-f797-4d6e-a53c-8a6a70f98888\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.693810 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crlvm\" (UniqueName: \"kubernetes.io/projected/b447a96e-6def-4771-b1f9-ed2f17a2fd37-kube-api-access-crlvm\") pod \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\" (UID: \"b447a96e-6def-4771-b1f9-ed2f17a2fd37\") " Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.693890 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzs2d\" (UniqueName: \"kubernetes.io/projected/deed29e2-f797-4d6e-a53c-8a6a70f98888-kube-api-access-wzs2d\") pod \"deed29e2-f797-4d6e-a53c-8a6a70f98888\" (UID: \"deed29e2-f797-4d6e-a53c-8a6a70f98888\") " Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.694725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b447a96e-6def-4771-b1f9-ed2f17a2fd37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b447a96e-6def-4771-b1f9-ed2f17a2fd37" (UID: "b447a96e-6def-4771-b1f9-ed2f17a2fd37"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.694734 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deed29e2-f797-4d6e-a53c-8a6a70f98888-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "deed29e2-f797-4d6e-a53c-8a6a70f98888" (UID: "deed29e2-f797-4d6e-a53c-8a6a70f98888"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.698830 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b447a96e-6def-4771-b1f9-ed2f17a2fd37-kube-api-access-crlvm" (OuterVolumeSpecName: "kube-api-access-crlvm") pod "b447a96e-6def-4771-b1f9-ed2f17a2fd37" (UID: "b447a96e-6def-4771-b1f9-ed2f17a2fd37"). InnerVolumeSpecName "kube-api-access-crlvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.700023 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deed29e2-f797-4d6e-a53c-8a6a70f98888-kube-api-access-wzs2d" (OuterVolumeSpecName: "kube-api-access-wzs2d") pod "deed29e2-f797-4d6e-a53c-8a6a70f98888" (UID: "deed29e2-f797-4d6e-a53c-8a6a70f98888"). InnerVolumeSpecName "kube-api-access-wzs2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.795889 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deed29e2-f797-4d6e-a53c-8a6a70f98888-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.795923 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crlvm\" (UniqueName: \"kubernetes.io/projected/b447a96e-6def-4771-b1f9-ed2f17a2fd37-kube-api-access-crlvm\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.795933 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzs2d\" (UniqueName: \"kubernetes.io/projected/deed29e2-f797-4d6e-a53c-8a6a70f98888-kube-api-access-wzs2d\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:21 crc kubenswrapper[4841]: I0130 06:36:21.795942 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b447a96e-6def-4771-b1f9-ed2f17a2fd37-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:22 crc kubenswrapper[4841]: I0130 06:36:22.148000 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zzzcl" Jan 30 06:36:22 crc kubenswrapper[4841]: I0130 06:36:22.148186 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zzzcl" event={"ID":"deed29e2-f797-4d6e-a53c-8a6a70f98888","Type":"ContainerDied","Data":"eb1329567618e047de0f1d20c925bfe8766e4055b1ecdac2cfd0a32dfe160737"} Jan 30 06:36:22 crc kubenswrapper[4841]: I0130 06:36:22.148224 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb1329567618e047de0f1d20c925bfe8766e4055b1ecdac2cfd0a32dfe160737" Jan 30 06:36:22 crc kubenswrapper[4841]: I0130 06:36:22.149225 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-caf3-account-create-update-rsk67" event={"ID":"b447a96e-6def-4771-b1f9-ed2f17a2fd37","Type":"ContainerDied","Data":"54d1f2bfad2b23d20ed050074e978c1968a31e5ab579e74f3526ff72f88abfe1"} Jan 30 06:36:22 crc kubenswrapper[4841]: I0130 06:36:22.149263 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54d1f2bfad2b23d20ed050074e978c1968a31e5ab579e74f3526ff72f88abfe1" Jan 30 06:36:22 crc kubenswrapper[4841]: I0130 06:36:22.149334 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-caf3-account-create-update-rsk67" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.666068 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bltvn"] Jan 30 06:36:23 crc kubenswrapper[4841]: E0130 06:36:23.666690 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b447a96e-6def-4771-b1f9-ed2f17a2fd37" containerName="mariadb-account-create-update" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.666705 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b447a96e-6def-4771-b1f9-ed2f17a2fd37" containerName="mariadb-account-create-update" Jan 30 06:36:23 crc kubenswrapper[4841]: E0130 06:36:23.666724 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deed29e2-f797-4d6e-a53c-8a6a70f98888" containerName="mariadb-database-create" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.666730 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="deed29e2-f797-4d6e-a53c-8a6a70f98888" containerName="mariadb-database-create" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.666878 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="deed29e2-f797-4d6e-a53c-8a6a70f98888" containerName="mariadb-database-create" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.666898 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b447a96e-6def-4771-b1f9-ed2f17a2fd37" containerName="mariadb-account-create-update" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.667467 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.669818 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.669862 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.669993 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.669887 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hqzvm" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.702066 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bltvn"] Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.731106 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-config-data\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.731347 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-combined-ca-bundle\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.731523 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjtl8\" (UniqueName: \"kubernetes.io/projected/670940fd-a625-4698-9f3c-12af46a73bf4-kube-api-access-jjtl8\") pod \"keystone-db-sync-bltvn\" (UID: 
\"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.832654 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjtl8\" (UniqueName: \"kubernetes.io/projected/670940fd-a625-4698-9f3c-12af46a73bf4-kube-api-access-jjtl8\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.832757 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-config-data\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.832810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-combined-ca-bundle\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.838899 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-combined-ca-bundle\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: I0130 06:36:23.839346 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-config-data\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:23 crc kubenswrapper[4841]: 
I0130 06:36:23.853364 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjtl8\" (UniqueName: \"kubernetes.io/projected/670940fd-a625-4698-9f3c-12af46a73bf4-kube-api-access-jjtl8\") pod \"keystone-db-sync-bltvn\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:24 crc kubenswrapper[4841]: I0130 06:36:24.011694 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:24 crc kubenswrapper[4841]: I0130 06:36:24.507075 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bltvn"] Jan 30 06:36:25 crc kubenswrapper[4841]: I0130 06:36:25.181962 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bltvn" event={"ID":"670940fd-a625-4698-9f3c-12af46a73bf4","Type":"ContainerStarted","Data":"8fa138e315a57640c62777a9d19e772b8f8b5bf1d5053e969ad622a94d7228b8"} Jan 30 06:36:25 crc kubenswrapper[4841]: I0130 06:36:25.183859 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bltvn" event={"ID":"670940fd-a625-4698-9f3c-12af46a73bf4","Type":"ContainerStarted","Data":"dba3d8b59f1cc85b63b9f34f231a497c047f253d71d3ad4e09413504dd5644c9"} Jan 30 06:36:25 crc kubenswrapper[4841]: I0130 06:36:25.204382 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bltvn" podStartSLOduration=2.204352092 podStartE2EDuration="2.204352092s" podCreationTimestamp="2026-01-30 06:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:25.204013894 +0000 UTC m=+5322.197486532" watchObservedRunningTime="2026-01-30 06:36:25.204352092 +0000 UTC m=+5322.197824760" Jan 30 06:36:27 crc kubenswrapper[4841]: I0130 06:36:27.197854 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="670940fd-a625-4698-9f3c-12af46a73bf4" containerID="8fa138e315a57640c62777a9d19e772b8f8b5bf1d5053e969ad622a94d7228b8" exitCode=0 Jan 30 06:36:27 crc kubenswrapper[4841]: I0130 06:36:27.197936 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bltvn" event={"ID":"670940fd-a625-4698-9f3c-12af46a73bf4","Type":"ContainerDied","Data":"8fa138e315a57640c62777a9d19e772b8f8b5bf1d5053e969ad622a94d7228b8"} Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.553594 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bltvn" Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.625141 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-config-data\") pod \"670940fd-a625-4698-9f3c-12af46a73bf4\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.625240 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-combined-ca-bundle\") pod \"670940fd-a625-4698-9f3c-12af46a73bf4\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.625386 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjtl8\" (UniqueName: \"kubernetes.io/projected/670940fd-a625-4698-9f3c-12af46a73bf4-kube-api-access-jjtl8\") pod \"670940fd-a625-4698-9f3c-12af46a73bf4\" (UID: \"670940fd-a625-4698-9f3c-12af46a73bf4\") " Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.632249 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/670940fd-a625-4698-9f3c-12af46a73bf4-kube-api-access-jjtl8" (OuterVolumeSpecName: "kube-api-access-jjtl8") pod 
"670940fd-a625-4698-9f3c-12af46a73bf4" (UID: "670940fd-a625-4698-9f3c-12af46a73bf4"). InnerVolumeSpecName "kube-api-access-jjtl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.656557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "670940fd-a625-4698-9f3c-12af46a73bf4" (UID: "670940fd-a625-4698-9f3c-12af46a73bf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.668925 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-config-data" (OuterVolumeSpecName: "config-data") pod "670940fd-a625-4698-9f3c-12af46a73bf4" (UID: "670940fd-a625-4698-9f3c-12af46a73bf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.729745 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.729783 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670940fd-a625-4698-9f3c-12af46a73bf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:28 crc kubenswrapper[4841]: I0130 06:36:28.729800 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjtl8\" (UniqueName: \"kubernetes.io/projected/670940fd-a625-4698-9f3c-12af46a73bf4-kube-api-access-jjtl8\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.217869 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bltvn" event={"ID":"670940fd-a625-4698-9f3c-12af46a73bf4","Type":"ContainerDied","Data":"dba3d8b59f1cc85b63b9f34f231a497c047f253d71d3ad4e09413504dd5644c9"}
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.217927 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dba3d8b59f1cc85b63b9f34f231a497c047f253d71d3ad4e09413504dd5644c9"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.217961 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bltvn"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.376160 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dd9744555-2f8bd"]
Jan 30 06:36:29 crc kubenswrapper[4841]: E0130 06:36:29.376788 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670940fd-a625-4698-9f3c-12af46a73bf4" containerName="keystone-db-sync"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.376810 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="670940fd-a625-4698-9f3c-12af46a73bf4" containerName="keystone-db-sync"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.377015 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="670940fd-a625-4698-9f3c-12af46a73bf4" containerName="keystone-db-sync"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.379355 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.393225 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd9744555-2f8bd"]
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.427154 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jc6c4"]
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.428283 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.430445 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.430666 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hqzvm"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.431663 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.431836 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.432478 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.433589 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jc6c4"]
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443467 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgjjg\" (UniqueName: \"kubernetes.io/projected/54044219-1c2b-482e-8f91-30c1910ef29c-kube-api-access-lgjjg\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443510 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-scripts\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443539 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-combined-ca-bundle\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443592 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-config\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-dns-svc\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443690 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-config-data\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443717 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-credential-keys\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-nb\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443775 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brnpl\" (UniqueName: \"kubernetes.io/projected/47b73063-5741-44b0-9ac2-0fd44822928e-kube-api-access-brnpl\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-fernet-keys\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.443850 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-sb\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.545220 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-config\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.545299 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-dns-svc\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.545327 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-config-data\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.545342 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-credential-keys\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.545361 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-nb\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.545386 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brnpl\" (UniqueName: \"kubernetes.io/projected/47b73063-5741-44b0-9ac2-0fd44822928e-kube-api-access-brnpl\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.546183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-nb\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.545425 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-fernet-keys\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.546220 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-config\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.546255 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-sb\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.546342 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgjjg\" (UniqueName: \"kubernetes.io/projected/54044219-1c2b-482e-8f91-30c1910ef29c-kube-api-access-lgjjg\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.546922 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-scripts\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.546955 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-combined-ca-bundle\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.547084 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-sb\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.547329 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-dns-svc\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.551875 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-combined-ca-bundle\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.553700 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-scripts\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.554081 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-config-data\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.555983 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-fernet-keys\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.557845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-credential-keys\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.573134 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brnpl\" (UniqueName: \"kubernetes.io/projected/47b73063-5741-44b0-9ac2-0fd44822928e-kube-api-access-brnpl\") pod \"dnsmasq-dns-dd9744555-2f8bd\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.575167 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgjjg\" (UniqueName: \"kubernetes.io/projected/54044219-1c2b-482e-8f91-30c1910ef29c-kube-api-access-lgjjg\") pod \"keystone-bootstrap-jc6c4\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") " pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.700696 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:29 crc kubenswrapper[4841]: I0130 06:36:29.746133 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:30 crc kubenswrapper[4841]: I0130 06:36:30.237741 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd9744555-2f8bd"]
Jan 30 06:36:30 crc kubenswrapper[4841]: W0130 06:36:30.248040 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b73063_5741_44b0_9ac2_0fd44822928e.slice/crio-a9dddcdad2762c2e0ba6a4fae8ce0930da1bb444634b8d0489aff5fede465e6b WatchSource:0}: Error finding container a9dddcdad2762c2e0ba6a4fae8ce0930da1bb444634b8d0489aff5fede465e6b: Status 404 returned error can't find the container with id a9dddcdad2762c2e0ba6a4fae8ce0930da1bb444634b8d0489aff5fede465e6b
Jan 30 06:36:30 crc kubenswrapper[4841]: I0130 06:36:30.340903 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jc6c4"]
Jan 30 06:36:30 crc kubenswrapper[4841]: W0130 06:36:30.350277 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54044219_1c2b_482e_8f91_30c1910ef29c.slice/crio-3044e4e128ab064356a79133503add484adb20e8e83e9acb34d77058c35660a7 WatchSource:0}: Error finding container 3044e4e128ab064356a79133503add484adb20e8e83e9acb34d77058c35660a7: Status 404 returned error can't find the container with id 3044e4e128ab064356a79133503add484adb20e8e83e9acb34d77058c35660a7
Jan 30 06:36:31 crc kubenswrapper[4841]: I0130 06:36:31.237498 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jc6c4" event={"ID":"54044219-1c2b-482e-8f91-30c1910ef29c","Type":"ContainerStarted","Data":"693fa6c383d647bff707e202f8e0064663d93ab71ebd7809058ad7b9c824121b"}
Jan 30 06:36:31 crc kubenswrapper[4841]: I0130 06:36:31.237828 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jc6c4" event={"ID":"54044219-1c2b-482e-8f91-30c1910ef29c","Type":"ContainerStarted","Data":"3044e4e128ab064356a79133503add484adb20e8e83e9acb34d77058c35660a7"}
Jan 30 06:36:31 crc kubenswrapper[4841]: I0130 06:36:31.240330 4841 generic.go:334] "Generic (PLEG): container finished" podID="47b73063-5741-44b0-9ac2-0fd44822928e" containerID="70d7c4ef8322c44919007b90deaadef6c091f5e1999afd8e89800460496e5b6b" exitCode=0
Jan 30 06:36:31 crc kubenswrapper[4841]: I0130 06:36:31.240672 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" event={"ID":"47b73063-5741-44b0-9ac2-0fd44822928e","Type":"ContainerDied","Data":"70d7c4ef8322c44919007b90deaadef6c091f5e1999afd8e89800460496e5b6b"}
Jan 30 06:36:31 crc kubenswrapper[4841]: I0130 06:36:31.240710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" event={"ID":"47b73063-5741-44b0-9ac2-0fd44822928e","Type":"ContainerStarted","Data":"a9dddcdad2762c2e0ba6a4fae8ce0930da1bb444634b8d0489aff5fede465e6b"}
Jan 30 06:36:31 crc kubenswrapper[4841]: I0130 06:36:31.258717 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jc6c4" podStartSLOduration=2.258698698 podStartE2EDuration="2.258698698s" podCreationTimestamp="2026-01-30 06:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:31.253178831 +0000 UTC m=+5328.246651469" watchObservedRunningTime="2026-01-30 06:36:31.258698698 +0000 UTC m=+5328.252171336"
Jan 30 06:36:32 crc kubenswrapper[4841]: I0130 06:36:32.251685 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" event={"ID":"47b73063-5741-44b0-9ac2-0fd44822928e","Type":"ContainerStarted","Data":"010b3abf249c6e927822c64d8edeb3fab1d86642596eabd6a97dd11469097406"}
Jan 30 06:36:32 crc kubenswrapper[4841]: I0130 06:36:32.252000 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dd9744555-2f8bd"
Jan 30 06:36:32 crc kubenswrapper[4841]: I0130 06:36:32.287115 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" podStartSLOduration=3.287092123 podStartE2EDuration="3.287092123s" podCreationTimestamp="2026-01-30 06:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:32.278821964 +0000 UTC m=+5329.272294602" watchObservedRunningTime="2026-01-30 06:36:32.287092123 +0000 UTC m=+5329.280564771"
Jan 30 06:36:33 crc kubenswrapper[4841]: I0130 06:36:33.199848 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Jan 30 06:36:34 crc kubenswrapper[4841]: I0130 06:36:34.276840 4841 generic.go:334] "Generic (PLEG): container finished" podID="54044219-1c2b-482e-8f91-30c1910ef29c" containerID="693fa6c383d647bff707e202f8e0064663d93ab71ebd7809058ad7b9c824121b" exitCode=0
Jan 30 06:36:34 crc kubenswrapper[4841]: I0130 06:36:34.276903 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jc6c4" event={"ID":"54044219-1c2b-482e-8f91-30c1910ef29c","Type":"ContainerDied","Data":"693fa6c383d647bff707e202f8e0064663d93ab71ebd7809058ad7b9c824121b"}
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.675610 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.756996 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-credential-keys\") pod \"54044219-1c2b-482e-8f91-30c1910ef29c\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") "
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.757113 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-config-data\") pod \"54044219-1c2b-482e-8f91-30c1910ef29c\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") "
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.757134 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-fernet-keys\") pod \"54044219-1c2b-482e-8f91-30c1910ef29c\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") "
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.757177 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgjjg\" (UniqueName: \"kubernetes.io/projected/54044219-1c2b-482e-8f91-30c1910ef29c-kube-api-access-lgjjg\") pod \"54044219-1c2b-482e-8f91-30c1910ef29c\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") "
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.757227 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-scripts\") pod \"54044219-1c2b-482e-8f91-30c1910ef29c\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") "
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.757254 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-combined-ca-bundle\") pod \"54044219-1c2b-482e-8f91-30c1910ef29c\" (UID: \"54044219-1c2b-482e-8f91-30c1910ef29c\") "
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.763366 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "54044219-1c2b-482e-8f91-30c1910ef29c" (UID: "54044219-1c2b-482e-8f91-30c1910ef29c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.763521 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-scripts" (OuterVolumeSpecName: "scripts") pod "54044219-1c2b-482e-8f91-30c1910ef29c" (UID: "54044219-1c2b-482e-8f91-30c1910ef29c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.765023 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54044219-1c2b-482e-8f91-30c1910ef29c-kube-api-access-lgjjg" (OuterVolumeSpecName: "kube-api-access-lgjjg") pod "54044219-1c2b-482e-8f91-30c1910ef29c" (UID: "54044219-1c2b-482e-8f91-30c1910ef29c"). InnerVolumeSpecName "kube-api-access-lgjjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.765933 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "54044219-1c2b-482e-8f91-30c1910ef29c" (UID: "54044219-1c2b-482e-8f91-30c1910ef29c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.781135 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54044219-1c2b-482e-8f91-30c1910ef29c" (UID: "54044219-1c2b-482e-8f91-30c1910ef29c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.799023 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-config-data" (OuterVolumeSpecName: "config-data") pod "54044219-1c2b-482e-8f91-30c1910ef29c" (UID: "54044219-1c2b-482e-8f91-30c1910ef29c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.859156 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.859222 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.859250 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.859276 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.859299 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54044219-1c2b-482e-8f91-30c1910ef29c-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:35 crc kubenswrapper[4841]: I0130 06:36:35.859321 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgjjg\" (UniqueName: \"kubernetes.io/projected/54044219-1c2b-482e-8f91-30c1910ef29c-kube-api-access-lgjjg\") on node \"crc\" DevicePath \"\""
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.298820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jc6c4" event={"ID":"54044219-1c2b-482e-8f91-30c1910ef29c","Type":"ContainerDied","Data":"3044e4e128ab064356a79133503add484adb20e8e83e9acb34d77058c35660a7"}
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.299355 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3044e4e128ab064356a79133503add484adb20e8e83e9acb34d77058c35660a7"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.298867 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jc6c4"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.543185 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jc6c4"]
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.555838 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jc6c4"]
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.662713 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-trbn6"]
Jan 30 06:36:36 crc kubenswrapper[4841]: E0130 06:36:36.663373 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54044219-1c2b-482e-8f91-30c1910ef29c" containerName="keystone-bootstrap"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.663450 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="54044219-1c2b-482e-8f91-30c1910ef29c" containerName="keystone-bootstrap"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.663789 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="54044219-1c2b-482e-8f91-30c1910ef29c" containerName="keystone-bootstrap"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.665349 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.669366 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.672483 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.672663 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.672669 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.672983 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hqzvm"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.681862 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-trbn6"]
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.775776 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-combined-ca-bundle\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.775953 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-credential-keys\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.776058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-config-data\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.776099 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-fernet-keys\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.776131 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-scripts\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.776204 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slh8x\" (UniqueName: \"kubernetes.io/projected/fa316b86-b9d2-4285-9ede-7319f99b2b13-kube-api-access-slh8x\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.878662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-credential-keys\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.878814 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-config-data\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.878859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-fernet-keys\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.878889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-scripts\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.878967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slh8x\" (UniqueName: \"kubernetes.io/projected/fa316b86-b9d2-4285-9ede-7319f99b2b13-kube-api-access-slh8x\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.879029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-combined-ca-bundle\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.884952 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-config-data\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.885547 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-combined-ca-bundle\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.886345 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-scripts\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.887016 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-credential-keys\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.903953 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-fernet-keys\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:36 crc kubenswrapper[4841]: I0130 06:36:36.910072 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slh8x\" (UniqueName: \"kubernetes.io/projected/fa316b86-b9d2-4285-9ede-7319f99b2b13-kube-api-access-slh8x\") pod \"keystone-bootstrap-trbn6\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " pod="openstack/keystone-bootstrap-trbn6"
Jan 30 06:36:37 crc kubenswrapper[4841]: I0130
06:36:37.008729 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-trbn6" Jan 30 06:36:37 crc kubenswrapper[4841]: I0130 06:36:37.513324 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-trbn6"] Jan 30 06:36:37 crc kubenswrapper[4841]: W0130 06:36:37.516946 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa316b86_b9d2_4285_9ede_7319f99b2b13.slice/crio-ef56e3b797761fe710c17ef97b22f09644015f3c9568b284c81d968c0e2c7512 WatchSource:0}: Error finding container ef56e3b797761fe710c17ef97b22f09644015f3c9568b284c81d968c0e2c7512: Status 404 returned error can't find the container with id ef56e3b797761fe710c17ef97b22f09644015f3c9568b284c81d968c0e2c7512 Jan 30 06:36:38 crc kubenswrapper[4841]: I0130 06:36:38.324999 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-trbn6" event={"ID":"fa316b86-b9d2-4285-9ede-7319f99b2b13","Type":"ContainerStarted","Data":"953edd53b511753824e06f1ab59887cc192942476e8602e5fb50cb89d2e5da02"} Jan 30 06:36:38 crc kubenswrapper[4841]: I0130 06:36:38.325373 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-trbn6" event={"ID":"fa316b86-b9d2-4285-9ede-7319f99b2b13","Type":"ContainerStarted","Data":"ef56e3b797761fe710c17ef97b22f09644015f3c9568b284c81d968c0e2c7512"} Jan 30 06:36:38 crc kubenswrapper[4841]: I0130 06:36:38.352875 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-trbn6" podStartSLOduration=2.352832543 podStartE2EDuration="2.352832543s" podCreationTimestamp="2026-01-30 06:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:38.345862117 +0000 UTC m=+5335.339334795" watchObservedRunningTime="2026-01-30 06:36:38.352832543 +0000 
UTC m=+5335.346305231" Jan 30 06:36:38 crc kubenswrapper[4841]: I0130 06:36:38.444836 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54044219-1c2b-482e-8f91-30c1910ef29c" path="/var/lib/kubelet/pods/54044219-1c2b-482e-8f91-30c1910ef29c/volumes" Jan 30 06:36:39 crc kubenswrapper[4841]: I0130 06:36:39.702609 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" Jan 30 06:36:39 crc kubenswrapper[4841]: I0130 06:36:39.796645 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69886d74bc-jh4dr"] Jan 30 06:36:39 crc kubenswrapper[4841]: I0130 06:36:39.796915 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" podUID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerName="dnsmasq-dns" containerID="cri-o://ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455" gracePeriod=10 Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.282664 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.344315 4841 generic.go:334] "Generic (PLEG): container finished" podID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerID="ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455" exitCode=0 Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.344363 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" event={"ID":"18d4de94-fb76-4b94-a887-a3e4c20c8761","Type":"ContainerDied","Data":"ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455"} Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.344506 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" event={"ID":"18d4de94-fb76-4b94-a887-a3e4c20c8761","Type":"ContainerDied","Data":"fe831850ec8ff25352a2b1e7e26fe2968a23b9286bd7d3d9f6f4e664bd0393fe"} Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.344534 4841 scope.go:117] "RemoveContainer" containerID="ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.344717 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69886d74bc-jh4dr" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.371810 4841 scope.go:117] "RemoveContainer" containerID="6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.400100 4841 scope.go:117] "RemoveContainer" containerID="ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455" Jan 30 06:36:40 crc kubenswrapper[4841]: E0130 06:36:40.400555 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455\": container with ID starting with ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455 not found: ID does not exist" containerID="ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.400596 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455"} err="failed to get container status \"ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455\": rpc error: code = NotFound desc = could not find container \"ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455\": container with ID starting with ca73a4790debc1d7285df3c7637aa5376cb5173980ee92fa33a1c85eb17f4455 not found: ID does not exist" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.400626 4841 scope.go:117] "RemoveContainer" containerID="6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447" Jan 30 06:36:40 crc kubenswrapper[4841]: E0130 06:36:40.404473 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447\": container with ID starting with 
6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447 not found: ID does not exist" containerID="6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.404527 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447"} err="failed to get container status \"6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447\": rpc error: code = NotFound desc = could not find container \"6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447\": container with ID starting with 6b94f6422f7515250dde9e44e3d53f6a1696af06426c06eec6f21ac6f27fa447 not found: ID does not exist" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.443732 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-dns-svc\") pod \"18d4de94-fb76-4b94-a887-a3e4c20c8761\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.444432 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-nb\") pod \"18d4de94-fb76-4b94-a887-a3e4c20c8761\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.444585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-config\") pod \"18d4de94-fb76-4b94-a887-a3e4c20c8761\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.444704 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-sb\") pod \"18d4de94-fb76-4b94-a887-a3e4c20c8761\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.446494 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/18d4de94-fb76-4b94-a887-a3e4c20c8761-kube-api-access-l5lj7\") pod \"18d4de94-fb76-4b94-a887-a3e4c20c8761\" (UID: \"18d4de94-fb76-4b94-a887-a3e4c20c8761\") " Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.453751 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d4de94-fb76-4b94-a887-a3e4c20c8761-kube-api-access-l5lj7" (OuterVolumeSpecName: "kube-api-access-l5lj7") pod "18d4de94-fb76-4b94-a887-a3e4c20c8761" (UID: "18d4de94-fb76-4b94-a887-a3e4c20c8761"). InnerVolumeSpecName "kube-api-access-l5lj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.493622 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18d4de94-fb76-4b94-a887-a3e4c20c8761" (UID: "18d4de94-fb76-4b94-a887-a3e4c20c8761"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.493899 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18d4de94-fb76-4b94-a887-a3e4c20c8761" (UID: "18d4de94-fb76-4b94-a887-a3e4c20c8761"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.494981 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-config" (OuterVolumeSpecName: "config") pod "18d4de94-fb76-4b94-a887-a3e4c20c8761" (UID: "18d4de94-fb76-4b94-a887-a3e4c20c8761"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.497462 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18d4de94-fb76-4b94-a887-a3e4c20c8761" (UID: "18d4de94-fb76-4b94-a887-a3e4c20c8761"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.549038 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/18d4de94-fb76-4b94-a887-a3e4c20c8761-kube-api-access-l5lj7\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.549081 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.549092 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.549102 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:40 crc 
kubenswrapper[4841]: I0130 06:36:40.549112 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18d4de94-fb76-4b94-a887-a3e4c20c8761-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.678955 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69886d74bc-jh4dr"] Jan 30 06:36:40 crc kubenswrapper[4841]: I0130 06:36:40.687041 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69886d74bc-jh4dr"] Jan 30 06:36:41 crc kubenswrapper[4841]: I0130 06:36:41.357155 4841 generic.go:334] "Generic (PLEG): container finished" podID="fa316b86-b9d2-4285-9ede-7319f99b2b13" containerID="953edd53b511753824e06f1ab59887cc192942476e8602e5fb50cb89d2e5da02" exitCode=0 Jan 30 06:36:41 crc kubenswrapper[4841]: I0130 06:36:41.357284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-trbn6" event={"ID":"fa316b86-b9d2-4285-9ede-7319f99b2b13","Type":"ContainerDied","Data":"953edd53b511753824e06f1ab59887cc192942476e8602e5fb50cb89d2e5da02"} Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.448177 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d4de94-fb76-4b94-a887-a3e4c20c8761" path="/var/lib/kubelet/pods/18d4de94-fb76-4b94-a887-a3e4c20c8761/volumes" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.725566 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-trbn6" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.888091 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-credential-keys\") pod \"fa316b86-b9d2-4285-9ede-7319f99b2b13\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.888361 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-scripts\") pod \"fa316b86-b9d2-4285-9ede-7319f99b2b13\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.889618 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-combined-ca-bundle\") pod \"fa316b86-b9d2-4285-9ede-7319f99b2b13\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.889802 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slh8x\" (UniqueName: \"kubernetes.io/projected/fa316b86-b9d2-4285-9ede-7319f99b2b13-kube-api-access-slh8x\") pod \"fa316b86-b9d2-4285-9ede-7319f99b2b13\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.889921 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-fernet-keys\") pod \"fa316b86-b9d2-4285-9ede-7319f99b2b13\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.890047 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-config-data\") pod \"fa316b86-b9d2-4285-9ede-7319f99b2b13\" (UID: \"fa316b86-b9d2-4285-9ede-7319f99b2b13\") " Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.894164 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-scripts" (OuterVolumeSpecName: "scripts") pod "fa316b86-b9d2-4285-9ede-7319f99b2b13" (UID: "fa316b86-b9d2-4285-9ede-7319f99b2b13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.894780 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fa316b86-b9d2-4285-9ede-7319f99b2b13" (UID: "fa316b86-b9d2-4285-9ede-7319f99b2b13"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.894919 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa316b86-b9d2-4285-9ede-7319f99b2b13-kube-api-access-slh8x" (OuterVolumeSpecName: "kube-api-access-slh8x") pod "fa316b86-b9d2-4285-9ede-7319f99b2b13" (UID: "fa316b86-b9d2-4285-9ede-7319f99b2b13"). InnerVolumeSpecName "kube-api-access-slh8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.901547 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fa316b86-b9d2-4285-9ede-7319f99b2b13" (UID: "fa316b86-b9d2-4285-9ede-7319f99b2b13"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.911452 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa316b86-b9d2-4285-9ede-7319f99b2b13" (UID: "fa316b86-b9d2-4285-9ede-7319f99b2b13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.911622 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-config-data" (OuterVolumeSpecName: "config-data") pod "fa316b86-b9d2-4285-9ede-7319f99b2b13" (UID: "fa316b86-b9d2-4285-9ede-7319f99b2b13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.992099 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slh8x\" (UniqueName: \"kubernetes.io/projected/fa316b86-b9d2-4285-9ede-7319f99b2b13-kube-api-access-slh8x\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.992139 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.992150 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.992160 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 
30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.992170 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:42 crc kubenswrapper[4841]: I0130 06:36:42.992179 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa316b86-b9d2-4285-9ede-7319f99b2b13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.383341 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-trbn6" event={"ID":"fa316b86-b9d2-4285-9ede-7319f99b2b13","Type":"ContainerDied","Data":"ef56e3b797761fe710c17ef97b22f09644015f3c9568b284c81d968c0e2c7512"} Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.383383 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef56e3b797761fe710c17ef97b22f09644015f3c9568b284c81d968c0e2c7512" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.383449 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-trbn6" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.484286 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b7f59b7f9-hz6dc"] Jan 30 06:36:43 crc kubenswrapper[4841]: E0130 06:36:43.484697 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerName="dnsmasq-dns" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.484726 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerName="dnsmasq-dns" Jan 30 06:36:43 crc kubenswrapper[4841]: E0130 06:36:43.484756 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerName="init" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.484765 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerName="init" Jan 30 06:36:43 crc kubenswrapper[4841]: E0130 06:36:43.484777 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa316b86-b9d2-4285-9ede-7319f99b2b13" containerName="keystone-bootstrap" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.484786 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa316b86-b9d2-4285-9ede-7319f99b2b13" containerName="keystone-bootstrap" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.484965 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa316b86-b9d2-4285-9ede-7319f99b2b13" containerName="keystone-bootstrap" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.484985 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d4de94-fb76-4b94-a887-a3e4c20c8761" containerName="dnsmasq-dns" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.485675 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.488762 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.489335 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hqzvm" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.489624 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.489805 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.489992 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.490296 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.497824 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b7f59b7f9-hz6dc"] Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.629058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-config-data\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.629614 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg965\" (UniqueName: \"kubernetes.io/projected/e2f58a90-22a1-478c-a19e-6e71499f307e-kube-api-access-cg965\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " 
pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.629768 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-scripts\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.629896 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-credential-keys\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.630107 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-public-tls-certs\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.630350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-fernet-keys\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.630437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-combined-ca-bundle\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " 
pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.630470 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-internal-tls-certs\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.732397 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-public-tls-certs\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.732690 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-fernet-keys\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.732716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-combined-ca-bundle\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.732734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-internal-tls-certs\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 
crc kubenswrapper[4841]: I0130 06:36:43.732769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-config-data\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.732790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg965\" (UniqueName: \"kubernetes.io/projected/e2f58a90-22a1-478c-a19e-6e71499f307e-kube-api-access-cg965\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.732824 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-scripts\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.732847 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-credential-keys\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.738527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-public-tls-certs\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.738527 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-internal-tls-certs\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.738706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-fernet-keys\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.740338 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-credential-keys\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.740582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-scripts\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.740617 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-config-data\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.742083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f58a90-22a1-478c-a19e-6e71499f307e-combined-ca-bundle\") pod 
\"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.759341 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg965\" (UniqueName: \"kubernetes.io/projected/e2f58a90-22a1-478c-a19e-6e71499f307e-kube-api-access-cg965\") pod \"keystone-7b7f59b7f9-hz6dc\" (UID: \"e2f58a90-22a1-478c-a19e-6e71499f307e\") " pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:43 crc kubenswrapper[4841]: I0130 06:36:43.840726 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:44 crc kubenswrapper[4841]: I0130 06:36:44.566481 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b7f59b7f9-hz6dc"] Jan 30 06:36:45 crc kubenswrapper[4841]: I0130 06:36:45.413362 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b7f59b7f9-hz6dc" event={"ID":"e2f58a90-22a1-478c-a19e-6e71499f307e","Type":"ContainerStarted","Data":"ef60f57586530447b6616cfa2382cbad6d9dca6bc7936eecda5c624876e83524"} Jan 30 06:36:45 crc kubenswrapper[4841]: I0130 06:36:45.413686 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b7f59b7f9-hz6dc" event={"ID":"e2f58a90-22a1-478c-a19e-6e71499f307e","Type":"ContainerStarted","Data":"b6154130362581559bddd0307552a86f4f0ea20d79b17fcd5f0ecdfe7f3021ad"} Jan 30 06:36:45 crc kubenswrapper[4841]: I0130 06:36:45.413703 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:36:45 crc kubenswrapper[4841]: I0130 06:36:45.439724 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b7f59b7f9-hz6dc" podStartSLOduration=2.439698205 podStartE2EDuration="2.439698205s" podCreationTimestamp="2026-01-30 06:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:36:45.438455332 +0000 UTC m=+5342.431927970" watchObservedRunningTime="2026-01-30 06:36:45.439698205 +0000 UTC m=+5342.433170843" Jan 30 06:37:15 crc kubenswrapper[4841]: I0130 06:37:15.345483 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b7f59b7f9-hz6dc" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.218305 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.220554 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.227099 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.227497 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.227977 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-mgwwh" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.237221 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.398069 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsrp9\" (UniqueName: \"kubernetes.io/projected/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-kube-api-access-zsrp9\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.398336 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.398409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.398657 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.500700 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.501003 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsrp9\" (UniqueName: \"kubernetes.io/projected/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-kube-api-access-zsrp9\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.501097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.501218 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.502961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.512641 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.516754 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.520923 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsrp9\" (UniqueName: \"kubernetes.io/projected/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-kube-api-access-zsrp9\") pod \"openstackclient\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " pod="openstack/openstackclient" Jan 30 06:37:18 crc kubenswrapper[4841]: I0130 06:37:18.593226 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:37:19 crc kubenswrapper[4841]: I0130 06:37:19.058211 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:37:19 crc kubenswrapper[4841]: W0130 06:37:19.062319 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4daf8ad7_5722_443b_9ac6_f9742ba7db0a.slice/crio-ae04fa022b21209786f13d3a6e84a3dd89cae072afd2126916a3fd5d311b04e3 WatchSource:0}: Error finding container ae04fa022b21209786f13d3a6e84a3dd89cae072afd2126916a3fd5d311b04e3: Status 404 returned error can't find the container with id ae04fa022b21209786f13d3a6e84a3dd89cae072afd2126916a3fd5d311b04e3 Jan 30 06:37:19 crc kubenswrapper[4841]: I0130 06:37:19.754303 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4daf8ad7-5722-443b-9ac6-f9742ba7db0a","Type":"ContainerStarted","Data":"cf7fa48e8b80bd88ae241deae17a448b0c4ee4e2d53672773ea5fc32fecfcec4"} Jan 30 06:37:19 crc kubenswrapper[4841]: I0130 06:37:19.754737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4daf8ad7-5722-443b-9ac6-f9742ba7db0a","Type":"ContainerStarted","Data":"ae04fa022b21209786f13d3a6e84a3dd89cae072afd2126916a3fd5d311b04e3"} Jan 30 06:37:19 crc kubenswrapper[4841]: I0130 06:37:19.789984 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.789948693 podStartE2EDuration="1.789948693s" podCreationTimestamp="2026-01-30 06:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:37:19.777059518 +0000 UTC m=+5376.770532166" watchObservedRunningTime="2026-01-30 06:37:19.789948693 +0000 UTC m=+5376.783421371" Jan 30 06:38:05 crc kubenswrapper[4841]: 
I0130 06:38:05.105499 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zx4n9"] Jan 30 06:38:05 crc kubenswrapper[4841]: I0130 06:38:05.115759 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zx4n9"] Jan 30 06:38:06 crc kubenswrapper[4841]: I0130 06:38:06.448902 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbafff3-ee90-444f-94fb-a03bcf07a439" path="/var/lib/kubelet/pods/ebbafff3-ee90-444f-94fb-a03bcf07a439/volumes" Jan 30 06:38:10 crc kubenswrapper[4841]: I0130 06:38:10.464288 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:38:10 crc kubenswrapper[4841]: I0130 06:38:10.464595 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:38:28 crc kubenswrapper[4841]: E0130 06:38:28.630982 4841 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.36:58804->38.102.83.36:43553: write tcp 38.102.83.36:58804->38.102.83.36:43553: write: broken pipe Jan 30 06:38:40 crc kubenswrapper[4841]: I0130 06:38:40.463621 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:38:40 crc kubenswrapper[4841]: I0130 06:38:40.464332 4841 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:38:51 crc kubenswrapper[4841]: I0130 06:38:51.474362 4841 scope.go:117] "RemoveContainer" containerID="aa67f1ce5ea8c43fbb1ff000f094fb3e174145b2d53cf88544796a4d2c0691ea" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.467820 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-nklng"] Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.470665 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.482003 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f53a-account-create-update-t7kwr"] Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.483467 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.486170 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.491931 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nklng"] Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.506148 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f53a-account-create-update-t7kwr"] Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.573730 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glkt2\" (UniqueName: \"kubernetes.io/projected/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-kube-api-access-glkt2\") pod \"barbican-f53a-account-create-update-t7kwr\" (UID: \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.574003 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6jrz\" (UniqueName: \"kubernetes.io/projected/35c897c9-7826-48bc-89cb-01a9622ea531-kube-api-access-h6jrz\") pod \"barbican-db-create-nklng\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.574028 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-operator-scripts\") pod \"barbican-f53a-account-create-update-t7kwr\" (UID: \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.574051 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35c897c9-7826-48bc-89cb-01a9622ea531-operator-scripts\") pod \"barbican-db-create-nklng\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.675420 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35c897c9-7826-48bc-89cb-01a9622ea531-operator-scripts\") pod \"barbican-db-create-nklng\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.675482 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glkt2\" (UniqueName: \"kubernetes.io/projected/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-kube-api-access-glkt2\") pod \"barbican-f53a-account-create-update-t7kwr\" (UID: \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.675584 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6jrz\" (UniqueName: \"kubernetes.io/projected/35c897c9-7826-48bc-89cb-01a9622ea531-kube-api-access-h6jrz\") pod \"barbican-db-create-nklng\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.675610 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-operator-scripts\") pod \"barbican-f53a-account-create-update-t7kwr\" (UID: \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 
06:38:57.676172 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35c897c9-7826-48bc-89cb-01a9622ea531-operator-scripts\") pod \"barbican-db-create-nklng\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.676269 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-operator-scripts\") pod \"barbican-f53a-account-create-update-t7kwr\" (UID: \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.693481 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glkt2\" (UniqueName: \"kubernetes.io/projected/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-kube-api-access-glkt2\") pod \"barbican-f53a-account-create-update-t7kwr\" (UID: \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.696605 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6jrz\" (UniqueName: \"kubernetes.io/projected/35c897c9-7826-48bc-89cb-01a9622ea531-kube-api-access-h6jrz\") pod \"barbican-db-create-nklng\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.804306 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nklng" Jan 30 06:38:57 crc kubenswrapper[4841]: I0130 06:38:57.818496 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.352454 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nklng"] Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.361755 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f53a-account-create-update-t7kwr"] Jan 30 06:38:58 crc kubenswrapper[4841]: W0130 06:38:58.363394 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35c897c9_7826_48bc_89cb_01a9622ea531.slice/crio-c6ff30da3190815f654130f68c300b3351455f46e61939f9f92a2b95de34733b WatchSource:0}: Error finding container c6ff30da3190815f654130f68c300b3351455f46e61939f9f92a2b95de34733b: Status 404 returned error can't find the container with id c6ff30da3190815f654130f68c300b3351455f46e61939f9f92a2b95de34733b Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.700350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nklng" event={"ID":"35c897c9-7826-48bc-89cb-01a9622ea531","Type":"ContainerStarted","Data":"283f4f5754c1552cbee44d9d748bf54079893509b6ca16e144da1a9ea78724d3"} Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.700619 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nklng" event={"ID":"35c897c9-7826-48bc-89cb-01a9622ea531","Type":"ContainerStarted","Data":"c6ff30da3190815f654130f68c300b3351455f46e61939f9f92a2b95de34733b"} Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.702711 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f53a-account-create-update-t7kwr" event={"ID":"23fc0c89-4399-4a71-ad3b-40cc2361d1b3","Type":"ContainerStarted","Data":"fc64e979b3458d4a87bcfd6e8565b68dd5cc597575da82700f3703fa395a8367"} Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.702762 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-f53a-account-create-update-t7kwr" event={"ID":"23fc0c89-4399-4a71-ad3b-40cc2361d1b3","Type":"ContainerStarted","Data":"f3e04daa004251905d6daf9b8f0c817e9c95a5e56202987a989f3d787c94046d"} Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.716391 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-nklng" podStartSLOduration=1.716366704 podStartE2EDuration="1.716366704s" podCreationTimestamp="2026-01-30 06:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:58.713706713 +0000 UTC m=+5475.707179351" watchObservedRunningTime="2026-01-30 06:38:58.716366704 +0000 UTC m=+5475.709839342" Jan 30 06:38:58 crc kubenswrapper[4841]: I0130 06:38:58.730269 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-f53a-account-create-update-t7kwr" podStartSLOduration=1.730250895 podStartE2EDuration="1.730250895s" podCreationTimestamp="2026-01-30 06:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:38:58.728529759 +0000 UTC m=+5475.722002397" watchObservedRunningTime="2026-01-30 06:38:58.730250895 +0000 UTC m=+5475.723723533" Jan 30 06:38:59 crc kubenswrapper[4841]: I0130 06:38:59.715917 4841 generic.go:334] "Generic (PLEG): container finished" podID="23fc0c89-4399-4a71-ad3b-40cc2361d1b3" containerID="fc64e979b3458d4a87bcfd6e8565b68dd5cc597575da82700f3703fa395a8367" exitCode=0 Jan 30 06:38:59 crc kubenswrapper[4841]: I0130 06:38:59.716314 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f53a-account-create-update-t7kwr" event={"ID":"23fc0c89-4399-4a71-ad3b-40cc2361d1b3","Type":"ContainerDied","Data":"fc64e979b3458d4a87bcfd6e8565b68dd5cc597575da82700f3703fa395a8367"} Jan 30 06:38:59 crc 
kubenswrapper[4841]: I0130 06:38:59.717879 4841 generic.go:334] "Generic (PLEG): container finished" podID="35c897c9-7826-48bc-89cb-01a9622ea531" containerID="283f4f5754c1552cbee44d9d748bf54079893509b6ca16e144da1a9ea78724d3" exitCode=0 Jan 30 06:38:59 crc kubenswrapper[4841]: I0130 06:38:59.717919 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nklng" event={"ID":"35c897c9-7826-48bc-89cb-01a9622ea531","Type":"ContainerDied","Data":"283f4f5754c1552cbee44d9d748bf54079893509b6ca16e144da1a9ea78724d3"} Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.107544 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nklng" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.113162 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.236906 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6jrz\" (UniqueName: \"kubernetes.io/projected/35c897c9-7826-48bc-89cb-01a9622ea531-kube-api-access-h6jrz\") pod \"35c897c9-7826-48bc-89cb-01a9622ea531\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.237193 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35c897c9-7826-48bc-89cb-01a9622ea531-operator-scripts\") pod \"35c897c9-7826-48bc-89cb-01a9622ea531\" (UID: \"35c897c9-7826-48bc-89cb-01a9622ea531\") " Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.237227 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glkt2\" (UniqueName: \"kubernetes.io/projected/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-kube-api-access-glkt2\") pod \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\" (UID: 
\"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.237865 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c897c9-7826-48bc-89cb-01a9622ea531-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35c897c9-7826-48bc-89cb-01a9622ea531" (UID: "35c897c9-7826-48bc-89cb-01a9622ea531"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.237901 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-operator-scripts\") pod \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\" (UID: \"23fc0c89-4399-4a71-ad3b-40cc2361d1b3\") " Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.237877 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23fc0c89-4399-4a71-ad3b-40cc2361d1b3" (UID: "23fc0c89-4399-4a71-ad3b-40cc2361d1b3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.238298 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.238311 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35c897c9-7826-48bc-89cb-01a9622ea531-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.243797 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-kube-api-access-glkt2" (OuterVolumeSpecName: "kube-api-access-glkt2") pod "23fc0c89-4399-4a71-ad3b-40cc2361d1b3" (UID: "23fc0c89-4399-4a71-ad3b-40cc2361d1b3"). InnerVolumeSpecName "kube-api-access-glkt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.244096 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c897c9-7826-48bc-89cb-01a9622ea531-kube-api-access-h6jrz" (OuterVolumeSpecName: "kube-api-access-h6jrz") pod "35c897c9-7826-48bc-89cb-01a9622ea531" (UID: "35c897c9-7826-48bc-89cb-01a9622ea531"). InnerVolumeSpecName "kube-api-access-h6jrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.340420 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glkt2\" (UniqueName: \"kubernetes.io/projected/23fc0c89-4399-4a71-ad3b-40cc2361d1b3-kube-api-access-glkt2\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.340475 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6jrz\" (UniqueName: \"kubernetes.io/projected/35c897c9-7826-48bc-89cb-01a9622ea531-kube-api-access-h6jrz\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.740102 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f53a-account-create-update-t7kwr" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.740084 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f53a-account-create-update-t7kwr" event={"ID":"23fc0c89-4399-4a71-ad3b-40cc2361d1b3","Type":"ContainerDied","Data":"f3e04daa004251905d6daf9b8f0c817e9c95a5e56202987a989f3d787c94046d"} Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.740178 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e04daa004251905d6daf9b8f0c817e9c95a5e56202987a989f3d787c94046d" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.743025 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nklng" event={"ID":"35c897c9-7826-48bc-89cb-01a9622ea531","Type":"ContainerDied","Data":"c6ff30da3190815f654130f68c300b3351455f46e61939f9f92a2b95de34733b"} Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.743052 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6ff30da3190815f654130f68c300b3351455f46e61939f9f92a2b95de34733b" Jan 30 06:39:01 crc kubenswrapper[4841]: I0130 06:39:01.743147 4841 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-db-create-nklng" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.917682 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-67gbp"] Jan 30 06:39:02 crc kubenswrapper[4841]: E0130 06:39:02.918705 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c897c9-7826-48bc-89cb-01a9622ea531" containerName="mariadb-database-create" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.918722 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c897c9-7826-48bc-89cb-01a9622ea531" containerName="mariadb-database-create" Jan 30 06:39:02 crc kubenswrapper[4841]: E0130 06:39:02.918760 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fc0c89-4399-4a71-ad3b-40cc2361d1b3" containerName="mariadb-account-create-update" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.918770 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fc0c89-4399-4a71-ad3b-40cc2361d1b3" containerName="mariadb-account-create-update" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.918981 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c897c9-7826-48bc-89cb-01a9622ea531" containerName="mariadb-database-create" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.919007 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fc0c89-4399-4a71-ad3b-40cc2361d1b3" containerName="mariadb-account-create-update" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.919701 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.922421 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s25s2" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.923163 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.925665 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-67gbp"] Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.972556 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-db-sync-config-data\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.973348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mm6\" (UniqueName: \"kubernetes.io/projected/cfcdfd69-9665-4513-a512-13b0d0914f33-kube-api-access-c2mm6\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:02 crc kubenswrapper[4841]: I0130 06:39:02.973424 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-combined-ca-bundle\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.075516 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mm6\" (UniqueName: 
\"kubernetes.io/projected/cfcdfd69-9665-4513-a512-13b0d0914f33-kube-api-access-c2mm6\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.075621 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-combined-ca-bundle\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.075695 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-db-sync-config-data\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.083095 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-combined-ca-bundle\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.089608 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-db-sync-config-data\") pod \"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.098510 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mm6\" (UniqueName: \"kubernetes.io/projected/cfcdfd69-9665-4513-a512-13b0d0914f33-kube-api-access-c2mm6\") pod 
\"barbican-db-sync-67gbp\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.245374 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.733423 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-67gbp"] Jan 30 06:39:03 crc kubenswrapper[4841]: I0130 06:39:03.766281 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-67gbp" event={"ID":"cfcdfd69-9665-4513-a512-13b0d0914f33","Type":"ContainerStarted","Data":"6dd819319d0e0e4d8a32db31413ba7782af98c10b4a7a338db1cb281e87df874"} Jan 30 06:39:04 crc kubenswrapper[4841]: I0130 06:39:04.776503 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-67gbp" event={"ID":"cfcdfd69-9665-4513-a512-13b0d0914f33","Type":"ContainerStarted","Data":"867d005cd21bd781ade02045dfd4c6c1ccd01682a05940e6aeef94bf359071ca"} Jan 30 06:39:04 crc kubenswrapper[4841]: I0130 06:39:04.799696 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-67gbp" podStartSLOduration=2.7996688450000002 podStartE2EDuration="2.799668845s" podCreationTimestamp="2026-01-30 06:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:04.79271326 +0000 UTC m=+5481.786185898" watchObservedRunningTime="2026-01-30 06:39:04.799668845 +0000 UTC m=+5481.793141513" Jan 30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.464582 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 
30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.465119 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.465164 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.465880 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8439101c0c816d59529151ad7b9f22d723c26d50210e194b3b0d5dd43185d44"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.465932 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://d8439101c0c816d59529151ad7b9f22d723c26d50210e194b3b0d5dd43185d44" gracePeriod=600 Jan 30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.835551 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="d8439101c0c816d59529151ad7b9f22d723c26d50210e194b3b0d5dd43185d44" exitCode=0 Jan 30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.835649 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" 
event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"d8439101c0c816d59529151ad7b9f22d723c26d50210e194b3b0d5dd43185d44"} Jan 30 06:39:10 crc kubenswrapper[4841]: I0130 06:39:10.835952 4841 scope.go:117] "RemoveContainer" containerID="b81f1a082f2475be7f114a878911282d87282c7ae50fdd49e3735cb4bbd84176" Jan 30 06:39:11 crc kubenswrapper[4841]: I0130 06:39:11.852614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"} Jan 30 06:39:19 crc kubenswrapper[4841]: I0130 06:39:19.940901 4841 generic.go:334] "Generic (PLEG): container finished" podID="cfcdfd69-9665-4513-a512-13b0d0914f33" containerID="867d005cd21bd781ade02045dfd4c6c1ccd01682a05940e6aeef94bf359071ca" exitCode=0 Jan 30 06:39:19 crc kubenswrapper[4841]: I0130 06:39:19.941084 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-67gbp" event={"ID":"cfcdfd69-9665-4513-a512-13b0d0914f33","Type":"ContainerDied","Data":"867d005cd21bd781ade02045dfd4c6c1ccd01682a05940e6aeef94bf359071ca"} Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.342868 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.448360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-combined-ca-bundle\") pod \"cfcdfd69-9665-4513-a512-13b0d0914f33\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.448585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2mm6\" (UniqueName: \"kubernetes.io/projected/cfcdfd69-9665-4513-a512-13b0d0914f33-kube-api-access-c2mm6\") pod \"cfcdfd69-9665-4513-a512-13b0d0914f33\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.448634 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-db-sync-config-data\") pod \"cfcdfd69-9665-4513-a512-13b0d0914f33\" (UID: \"cfcdfd69-9665-4513-a512-13b0d0914f33\") " Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.455344 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcdfd69-9665-4513-a512-13b0d0914f33-kube-api-access-c2mm6" (OuterVolumeSpecName: "kube-api-access-c2mm6") pod "cfcdfd69-9665-4513-a512-13b0d0914f33" (UID: "cfcdfd69-9665-4513-a512-13b0d0914f33"). InnerVolumeSpecName "kube-api-access-c2mm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.457690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cfcdfd69-9665-4513-a512-13b0d0914f33" (UID: "cfcdfd69-9665-4513-a512-13b0d0914f33"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.488766 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfcdfd69-9665-4513-a512-13b0d0914f33" (UID: "cfcdfd69-9665-4513-a512-13b0d0914f33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.551135 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.551164 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2mm6\" (UniqueName: \"kubernetes.io/projected/cfcdfd69-9665-4513-a512-13b0d0914f33-kube-api-access-c2mm6\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.551174 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfcdfd69-9665-4513-a512-13b0d0914f33-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.960819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-67gbp" event={"ID":"cfcdfd69-9665-4513-a512-13b0d0914f33","Type":"ContainerDied","Data":"6dd819319d0e0e4d8a32db31413ba7782af98c10b4a7a338db1cb281e87df874"} Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.960863 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd819319d0e0e4d8a32db31413ba7782af98c10b4a7a338db1cb281e87df874" Jan 30 06:39:21 crc kubenswrapper[4841]: I0130 06:39:21.960954 4841 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-67gbp" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.159182 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7bdf49ff4b-r29n8"] Jan 30 06:39:22 crc kubenswrapper[4841]: E0130 06:39:22.162946 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcdfd69-9665-4513-a512-13b0d0914f33" containerName="barbican-db-sync" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.162968 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcdfd69-9665-4513-a512-13b0d0914f33" containerName="barbican-db-sync" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.163135 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcdfd69-9665-4513-a512-13b0d0914f33" containerName="barbican-db-sync" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.163976 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.169246 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s25s2" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.169301 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-775cc4458f-ltjlx"] Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.169439 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.169458 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.170335 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.172215 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.178044 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-775cc4458f-ltjlx"] Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.195238 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7bdf49ff4b-r29n8"] Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336470 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-config-data-custom\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336514 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-config-data-custom\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-combined-ca-bundle\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336686 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b47l\" (UniqueName: \"kubernetes.io/projected/5daf3957-8483-40b4-a236-f92459dab9e4-kube-api-access-4b47l\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336736 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b2b5e9-9162-48d4-b839-18e4d5316535-logs\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-combined-ca-bundle\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336843 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-config-data\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.336911 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5daf3957-8483-40b4-a236-f92459dab9e4-logs\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " 
pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.337127 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9d6m\" (UniqueName: \"kubernetes.io/projected/a1b2b5e9-9162-48d4-b839-18e4d5316535-kube-api-access-b9d6m\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.337216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-config-data\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.420146 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f4c986487-5m7g9"] Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.421457 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9d6m\" (UniqueName: \"kubernetes.io/projected/a1b2b5e9-9162-48d4-b839-18e4d5316535-kube-api-access-b9d6m\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438286 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-config-data\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438326 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-config-data-custom\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438344 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-config-data-custom\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438360 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-combined-ca-bundle\") pod 
\"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b47l\" (UniqueName: \"kubernetes.io/projected/5daf3957-8483-40b4-a236-f92459dab9e4-kube-api-access-4b47l\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438425 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b2b5e9-9162-48d4-b839-18e4d5316535-logs\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438466 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-combined-ca-bundle\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438488 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-config-data\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438517 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5daf3957-8483-40b4-a236-f92459dab9e4-logs\") pod 
\"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.438847 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5daf3957-8483-40b4-a236-f92459dab9e4-logs\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.444515 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f4c986487-5m7g9"] Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.445321 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-config-data\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.447910 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-combined-ca-bundle\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.448184 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b2b5e9-9162-48d4-b839-18e4d5316535-logs\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.452918 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5daf3957-8483-40b4-a236-f92459dab9e4-config-data-custom\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.453568 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-config-data-custom\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.453922 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-combined-ca-bundle\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.458246 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9d6m\" (UniqueName: \"kubernetes.io/projected/a1b2b5e9-9162-48d4-b839-18e4d5316535-kube-api-access-b9d6m\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.460460 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b2b5e9-9162-48d4-b839-18e4d5316535-config-data\") pod \"barbican-worker-775cc4458f-ltjlx\" (UID: \"a1b2b5e9-9162-48d4-b839-18e4d5316535\") " pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.463383 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4b47l\" (UniqueName: \"kubernetes.io/projected/5daf3957-8483-40b4-a236-f92459dab9e4-kube-api-access-4b47l\") pod \"barbican-keystone-listener-7bdf49ff4b-r29n8\" (UID: \"5daf3957-8483-40b4-a236-f92459dab9e4\") " pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.540682 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-config\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.540729 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckljd\" (UniqueName: \"kubernetes.io/projected/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-kube-api-access-ckljd\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.540747 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-nb\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.541087 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-dns-svc\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 
06:39:22.541304 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.550941 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-645d96f7d8-wkzsd"] Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.552793 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.555266 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.574931 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-645d96f7d8-wkzsd"] Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.643573 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.643636 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-config\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.643664 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckljd\" (UniqueName: 
\"kubernetes.io/projected/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-kube-api-access-ckljd\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.643681 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-nb\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.643709 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-dns-svc\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.644507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-dns-svc\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.647737 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.649385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-config\") pod 
\"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.650613 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-nb\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.659051 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.666051 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-775cc4458f-ltjlx" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.681862 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckljd\" (UniqueName: \"kubernetes.io/projected/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-kube-api-access-ckljd\") pod \"dnsmasq-dns-6f4c986487-5m7g9\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.744931 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-combined-ca-bundle\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.745035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data-custom\") pod 
\"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.745057 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27kgb\" (UniqueName: \"kubernetes.io/projected/5d14b133-0ff6-49a1-8c0c-991b8908855a-kube-api-access-27kgb\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.745076 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d14b133-0ff6-49a1-8c0c-991b8908855a-logs\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.745093 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.748937 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.846450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data-custom\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.846758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27kgb\" (UniqueName: \"kubernetes.io/projected/5d14b133-0ff6-49a1-8c0c-991b8908855a-kube-api-access-27kgb\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.846776 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d14b133-0ff6-49a1-8c0c-991b8908855a-logs\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.846794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.846865 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-combined-ca-bundle\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " 
pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.851269 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-combined-ca-bundle\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.851578 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d14b133-0ff6-49a1-8c0c-991b8908855a-logs\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.858903 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data-custom\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.861714 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:22 crc kubenswrapper[4841]: I0130 06:39:22.876176 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27kgb\" (UniqueName: \"kubernetes.io/projected/5d14b133-0ff6-49a1-8c0c-991b8908855a-kube-api-access-27kgb\") pod \"barbican-api-645d96f7d8-wkzsd\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:23 crc kubenswrapper[4841]: 
I0130 06:39:23.170899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.388134 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7bdf49ff4b-r29n8"] Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.406984 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-775cc4458f-ltjlx"] Jan 30 06:39:23 crc kubenswrapper[4841]: W0130 06:39:23.412630 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1b2b5e9_9162_48d4_b839_18e4d5316535.slice/crio-1fa395d1f1e22e2da19c81b64eb2234dc2e0e256b3f1d945ac4c9055dfe1973a WatchSource:0}: Error finding container 1fa395d1f1e22e2da19c81b64eb2234dc2e0e256b3f1d945ac4c9055dfe1973a: Status 404 returned error can't find the container with id 1fa395d1f1e22e2da19c81b64eb2234dc2e0e256b3f1d945ac4c9055dfe1973a Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.509759 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f4c986487-5m7g9"] Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.671905 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-645d96f7d8-wkzsd"] Jan 30 06:39:23 crc kubenswrapper[4841]: W0130 06:39:23.677034 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d14b133_0ff6_49a1_8c0c_991b8908855a.slice/crio-453dc9494104bfcc5c48007cd5631ce8d8474a9407b33c43478ef029aaf428de WatchSource:0}: Error finding container 453dc9494104bfcc5c48007cd5631ce8d8474a9407b33c43478ef029aaf428de: Status 404 returned error can't find the container with id 453dc9494104bfcc5c48007cd5631ce8d8474a9407b33c43478ef029aaf428de Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.989931 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" event={"ID":"c0fde94d-b3ff-4448-bedd-eef41eccd1f1","Type":"ContainerStarted","Data":"2088ba1be7d6474a58aba0cb23f220660c5364b387c12555a5f5182debb7b37f"} Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.989976 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" event={"ID":"c0fde94d-b3ff-4448-bedd-eef41eccd1f1","Type":"ContainerStarted","Data":"2c8022419cb8f75fae6a206ab0c594c1bae80183d321bfdf91cfe5075784dcd5"} Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.991101 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-645d96f7d8-wkzsd" event={"ID":"5d14b133-0ff6-49a1-8c0c-991b8908855a","Type":"ContainerStarted","Data":"453dc9494104bfcc5c48007cd5631ce8d8474a9407b33c43478ef029aaf428de"} Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.993730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-775cc4458f-ltjlx" event={"ID":"a1b2b5e9-9162-48d4-b839-18e4d5316535","Type":"ContainerStarted","Data":"51db67220bac0effd8c4b9989d9fc8d90674d644ac1e1c8bf14f1721b7053f57"} Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.993754 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-775cc4458f-ltjlx" event={"ID":"a1b2b5e9-9162-48d4-b839-18e4d5316535","Type":"ContainerStarted","Data":"1fa395d1f1e22e2da19c81b64eb2234dc2e0e256b3f1d945ac4c9055dfe1973a"} Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.999469 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" event={"ID":"5daf3957-8483-40b4-a236-f92459dab9e4","Type":"ContainerStarted","Data":"94c54d1b1e9f439cde6470313211cc80755fe1645ad897ea18d37940b8231e88"} Jan 30 06:39:23 crc kubenswrapper[4841]: I0130 06:39:23.999489 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" 
event={"ID":"5daf3957-8483-40b4-a236-f92459dab9e4","Type":"ContainerStarted","Data":"c1c10863e00870e2ee5b4e076033f31aa5b4dd1968dfe37e09085cc89d31d67c"} Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.412330 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7579c69d68-vszjk"] Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.414748 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.417901 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.418100 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.423821 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7579c69d68-vszjk"] Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.613844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-config-data-custom\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.613897 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-combined-ca-bundle\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.614081 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-config-data\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.614185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xt79\" (UniqueName: \"kubernetes.io/projected/521faf93-2a50-42d0-851e-3f2b407aeb5f-kube-api-access-6xt79\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.614301 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/521faf93-2a50-42d0-851e-3f2b407aeb5f-logs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.614367 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-internal-tls-certs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.614493 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-public-tls-certs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.715727 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6xt79\" (UniqueName: \"kubernetes.io/projected/521faf93-2a50-42d0-851e-3f2b407aeb5f-kube-api-access-6xt79\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.715820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/521faf93-2a50-42d0-851e-3f2b407aeb5f-logs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.715882 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-internal-tls-certs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.715944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-public-tls-certs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.715973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-config-data-custom\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.715997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-combined-ca-bundle\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.716069 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-config-data\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.717345 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/521faf93-2a50-42d0-851e-3f2b407aeb5f-logs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.720554 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-combined-ca-bundle\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.721083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-internal-tls-certs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.721207 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-config-data-custom\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.722142 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-config-data\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.726149 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/521faf93-2a50-42d0-851e-3f2b407aeb5f-public-tls-certs\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.732424 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xt79\" (UniqueName: \"kubernetes.io/projected/521faf93-2a50-42d0-851e-3f2b407aeb5f-kube-api-access-6xt79\") pod \"barbican-api-7579c69d68-vszjk\" (UID: \"521faf93-2a50-42d0-851e-3f2b407aeb5f\") " pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:24 crc kubenswrapper[4841]: I0130 06:39:24.735051 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.007470 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" event={"ID":"5daf3957-8483-40b4-a236-f92459dab9e4","Type":"ContainerStarted","Data":"bc496bd4d76c1aaa05d0990f091cbed439921d9b539bf26352797c15ccdb0c03"} Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.009546 4841 generic.go:334] "Generic (PLEG): container finished" podID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerID="2088ba1be7d6474a58aba0cb23f220660c5364b387c12555a5f5182debb7b37f" exitCode=0 Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.009629 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" event={"ID":"c0fde94d-b3ff-4448-bedd-eef41eccd1f1","Type":"ContainerDied","Data":"2088ba1be7d6474a58aba0cb23f220660c5364b387c12555a5f5182debb7b37f"} Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.028747 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-645d96f7d8-wkzsd" event={"ID":"5d14b133-0ff6-49a1-8c0c-991b8908855a","Type":"ContainerStarted","Data":"3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093"} Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.028790 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-645d96f7d8-wkzsd" event={"ID":"5d14b133-0ff6-49a1-8c0c-991b8908855a","Type":"ContainerStarted","Data":"3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0"} Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.029216 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.029529 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:25 crc kubenswrapper[4841]: 
I0130 06:39:25.033277 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7bdf49ff4b-r29n8" podStartSLOduration=3.033262016 podStartE2EDuration="3.033262016s" podCreationTimestamp="2026-01-30 06:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:25.027967035 +0000 UTC m=+5502.021439693" watchObservedRunningTime="2026-01-30 06:39:25.033262016 +0000 UTC m=+5502.026734654" Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.034306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-775cc4458f-ltjlx" event={"ID":"a1b2b5e9-9162-48d4-b839-18e4d5316535","Type":"ContainerStarted","Data":"8bf19037b9831960864a8cf748c8ee644393339d8550326420819157c0b0207d"} Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.113297 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-775cc4458f-ltjlx" podStartSLOduration=3.11328106 podStartE2EDuration="3.11328106s" podCreationTimestamp="2026-01-30 06:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:25.111362299 +0000 UTC m=+5502.104834957" watchObservedRunningTime="2026-01-30 06:39:25.11328106 +0000 UTC m=+5502.106753688" Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.120620 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-645d96f7d8-wkzsd" podStartSLOduration=3.120600325 podStartE2EDuration="3.120600325s" podCreationTimestamp="2026-01-30 06:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:25.07804344 +0000 UTC m=+5502.071516078" watchObservedRunningTime="2026-01-30 06:39:25.120600325 +0000 UTC 
m=+5502.114072963" Jan 30 06:39:25 crc kubenswrapper[4841]: I0130 06:39:25.218665 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7579c69d68-vszjk"] Jan 30 06:39:25 crc kubenswrapper[4841]: W0130 06:39:25.227242 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod521faf93_2a50_42d0_851e_3f2b407aeb5f.slice/crio-3c435907184e3fba82a0dcaa9ee0bd82055001a3db9441a66efad134e3a78812 WatchSource:0}: Error finding container 3c435907184e3fba82a0dcaa9ee0bd82055001a3db9441a66efad134e3a78812: Status 404 returned error can't find the container with id 3c435907184e3fba82a0dcaa9ee0bd82055001a3db9441a66efad134e3a78812 Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.043284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" event={"ID":"c0fde94d-b3ff-4448-bedd-eef41eccd1f1","Type":"ContainerStarted","Data":"c285ceb89751114494e42d3f9d4879dec3c330725d49efe61ef36818504762d2"} Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.044472 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.047086 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7579c69d68-vszjk" event={"ID":"521faf93-2a50-42d0-851e-3f2b407aeb5f","Type":"ContainerStarted","Data":"1601b58bed61192142db126663338ffd09e2d0762021b6a2573e60c70cee26b8"} Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.047109 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7579c69d68-vszjk" event={"ID":"521faf93-2a50-42d0-851e-3f2b407aeb5f","Type":"ContainerStarted","Data":"319d5660a6e2cd950f325f6a9c6cf1c2e3a41677d90785c8aaeebae17f743410"} Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.047134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-7579c69d68-vszjk" event={"ID":"521faf93-2a50-42d0-851e-3f2b407aeb5f","Type":"ContainerStarted","Data":"3c435907184e3fba82a0dcaa9ee0bd82055001a3db9441a66efad134e3a78812"} Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.047147 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.048130 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.086730 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7579c69d68-vszjk" podStartSLOduration=2.086712124 podStartE2EDuration="2.086712124s" podCreationTimestamp="2026-01-30 06:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:26.082756479 +0000 UTC m=+5503.076229117" watchObservedRunningTime="2026-01-30 06:39:26.086712124 +0000 UTC m=+5503.080184752" Jan 30 06:39:26 crc kubenswrapper[4841]: I0130 06:39:26.087176 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" podStartSLOduration=4.087169687 podStartE2EDuration="4.087169687s" podCreationTimestamp="2026-01-30 06:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:26.062379596 +0000 UTC m=+5503.055852234" watchObservedRunningTime="2026-01-30 06:39:26.087169687 +0000 UTC m=+5503.080642325" Jan 30 06:39:29 crc kubenswrapper[4841]: I0130 06:39:29.541978 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:30 crc kubenswrapper[4841]: I0130 06:39:30.937370 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:32 crc kubenswrapper[4841]: I0130 06:39:32.750576 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:39:32 crc kubenswrapper[4841]: I0130 06:39:32.811127 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dd9744555-2f8bd"] Jan 30 06:39:32 crc kubenswrapper[4841]: I0130 06:39:32.811593 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" podUID="47b73063-5741-44b0-9ac2-0fd44822928e" containerName="dnsmasq-dns" containerID="cri-o://010b3abf249c6e927822c64d8edeb3fab1d86642596eabd6a97dd11469097406" gracePeriod=10 Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.133877 4841 generic.go:334] "Generic (PLEG): container finished" podID="47b73063-5741-44b0-9ac2-0fd44822928e" containerID="010b3abf249c6e927822c64d8edeb3fab1d86642596eabd6a97dd11469097406" exitCode=0 Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.134144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" event={"ID":"47b73063-5741-44b0-9ac2-0fd44822928e","Type":"ContainerDied","Data":"010b3abf249c6e927822c64d8edeb3fab1d86642596eabd6a97dd11469097406"} Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.337392 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.498962 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-nb\") pod \"47b73063-5741-44b0-9ac2-0fd44822928e\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.499837 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-sb\") pod \"47b73063-5741-44b0-9ac2-0fd44822928e\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.499922 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-dns-svc\") pod \"47b73063-5741-44b0-9ac2-0fd44822928e\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.499984 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brnpl\" (UniqueName: \"kubernetes.io/projected/47b73063-5741-44b0-9ac2-0fd44822928e-kube-api-access-brnpl\") pod \"47b73063-5741-44b0-9ac2-0fd44822928e\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.500103 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-config\") pod \"47b73063-5741-44b0-9ac2-0fd44822928e\" (UID: \"47b73063-5741-44b0-9ac2-0fd44822928e\") " Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.511687 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/47b73063-5741-44b0-9ac2-0fd44822928e-kube-api-access-brnpl" (OuterVolumeSpecName: "kube-api-access-brnpl") pod "47b73063-5741-44b0-9ac2-0fd44822928e" (UID: "47b73063-5741-44b0-9ac2-0fd44822928e"). InnerVolumeSpecName "kube-api-access-brnpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.544796 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47b73063-5741-44b0-9ac2-0fd44822928e" (UID: "47b73063-5741-44b0-9ac2-0fd44822928e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.552223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47b73063-5741-44b0-9ac2-0fd44822928e" (UID: "47b73063-5741-44b0-9ac2-0fd44822928e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.552423 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-config" (OuterVolumeSpecName: "config") pod "47b73063-5741-44b0-9ac2-0fd44822928e" (UID: "47b73063-5741-44b0-9ac2-0fd44822928e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.588719 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "47b73063-5741-44b0-9ac2-0fd44822928e" (UID: "47b73063-5741-44b0-9ac2-0fd44822928e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.602775 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brnpl\" (UniqueName: \"kubernetes.io/projected/47b73063-5741-44b0-9ac2-0fd44822928e-kube-api-access-brnpl\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.602823 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.602836 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.602846 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:33 crc kubenswrapper[4841]: I0130 06:39:33.602858 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47b73063-5741-44b0-9ac2-0fd44822928e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:34 crc kubenswrapper[4841]: I0130 06:39:34.151215 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" event={"ID":"47b73063-5741-44b0-9ac2-0fd44822928e","Type":"ContainerDied","Data":"a9dddcdad2762c2e0ba6a4fae8ce0930da1bb444634b8d0489aff5fede465e6b"} Jan 30 06:39:34 crc kubenswrapper[4841]: I0130 06:39:34.151289 4841 scope.go:117] "RemoveContainer" containerID="010b3abf249c6e927822c64d8edeb3fab1d86642596eabd6a97dd11469097406" Jan 30 06:39:34 crc kubenswrapper[4841]: I0130 06:39:34.151493 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dd9744555-2f8bd" Jan 30 06:39:34 crc kubenswrapper[4841]: I0130 06:39:34.205414 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dd9744555-2f8bd"] Jan 30 06:39:34 crc kubenswrapper[4841]: I0130 06:39:34.217294 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dd9744555-2f8bd"] Jan 30 06:39:34 crc kubenswrapper[4841]: I0130 06:39:34.220904 4841 scope.go:117] "RemoveContainer" containerID="70d7c4ef8322c44919007b90deaadef6c091f5e1999afd8e89800460496e5b6b" Jan 30 06:39:34 crc kubenswrapper[4841]: I0130 06:39:34.447605 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b73063-5741-44b0-9ac2-0fd44822928e" path="/var/lib/kubelet/pods/47b73063-5741-44b0-9ac2-0fd44822928e/volumes" Jan 30 06:39:35 crc kubenswrapper[4841]: I0130 06:39:35.993806 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:35 crc kubenswrapper[4841]: I0130 06:39:35.999185 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7579c69d68-vszjk" Jan 30 06:39:36 crc kubenswrapper[4841]: I0130 06:39:36.114477 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-645d96f7d8-wkzsd"] Jan 30 06:39:36 crc kubenswrapper[4841]: I0130 06:39:36.114692 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-645d96f7d8-wkzsd" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api-log" containerID="cri-o://3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093" gracePeriod=30 Jan 30 06:39:36 crc kubenswrapper[4841]: I0130 06:39:36.114834 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-645d96f7d8-wkzsd" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api" 
containerID="cri-o://3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0" gracePeriod=30 Jan 30 06:39:37 crc kubenswrapper[4841]: I0130 06:39:37.177934 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerID="3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093" exitCode=143 Jan 30 06:39:37 crc kubenswrapper[4841]: I0130 06:39:37.178018 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-645d96f7d8-wkzsd" event={"ID":"5d14b133-0ff6-49a1-8c0c-991b8908855a","Type":"ContainerDied","Data":"3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093"} Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.262922 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-645d96f7d8-wkzsd" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.36:9311/healthcheck\": read tcp 10.217.0.2:49256->10.217.1.36:9311: read: connection reset by peer" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.262975 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-645d96f7d8-wkzsd" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.36:9311/healthcheck\": read tcp 10.217.0.2:49270->10.217.1.36:9311: read: connection reset by peer" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.697192 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.838443 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data-custom\") pod \"5d14b133-0ff6-49a1-8c0c-991b8908855a\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.839419 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27kgb\" (UniqueName: \"kubernetes.io/projected/5d14b133-0ff6-49a1-8c0c-991b8908855a-kube-api-access-27kgb\") pod \"5d14b133-0ff6-49a1-8c0c-991b8908855a\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.839466 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-combined-ca-bundle\") pod \"5d14b133-0ff6-49a1-8c0c-991b8908855a\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.839565 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data\") pod \"5d14b133-0ff6-49a1-8c0c-991b8908855a\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.839623 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d14b133-0ff6-49a1-8c0c-991b8908855a-logs\") pod \"5d14b133-0ff6-49a1-8c0c-991b8908855a\" (UID: \"5d14b133-0ff6-49a1-8c0c-991b8908855a\") " Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.840540 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5d14b133-0ff6-49a1-8c0c-991b8908855a-logs" (OuterVolumeSpecName: "logs") pod "5d14b133-0ff6-49a1-8c0c-991b8908855a" (UID: "5d14b133-0ff6-49a1-8c0c-991b8908855a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.844851 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d14b133-0ff6-49a1-8c0c-991b8908855a-kube-api-access-27kgb" (OuterVolumeSpecName: "kube-api-access-27kgb") pod "5d14b133-0ff6-49a1-8c0c-991b8908855a" (UID: "5d14b133-0ff6-49a1-8c0c-991b8908855a"). InnerVolumeSpecName "kube-api-access-27kgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.845391 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d14b133-0ff6-49a1-8c0c-991b8908855a" (UID: "5d14b133-0ff6-49a1-8c0c-991b8908855a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.864488 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d14b133-0ff6-49a1-8c0c-991b8908855a" (UID: "5d14b133-0ff6-49a1-8c0c-991b8908855a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.906112 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data" (OuterVolumeSpecName: "config-data") pod "5d14b133-0ff6-49a1-8c0c-991b8908855a" (UID: "5d14b133-0ff6-49a1-8c0c-991b8908855a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.942069 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.942127 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.942149 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d14b133-0ff6-49a1-8c0c-991b8908855a-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.942167 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d14b133-0ff6-49a1-8c0c-991b8908855a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:39 crc kubenswrapper[4841]: I0130 06:39:39.942184 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27kgb\" (UniqueName: \"kubernetes.io/projected/5d14b133-0ff6-49a1-8c0c-991b8908855a-kube-api-access-27kgb\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.214094 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerID="3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0" exitCode=0 Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.214131 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-645d96f7d8-wkzsd" event={"ID":"5d14b133-0ff6-49a1-8c0c-991b8908855a","Type":"ContainerDied","Data":"3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0"} Jan 30 
06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.214155 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-645d96f7d8-wkzsd" event={"ID":"5d14b133-0ff6-49a1-8c0c-991b8908855a","Type":"ContainerDied","Data":"453dc9494104bfcc5c48007cd5631ce8d8474a9407b33c43478ef029aaf428de"} Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.214169 4841 scope.go:117] "RemoveContainer" containerID="3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.214266 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-645d96f7d8-wkzsd" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.244382 4841 scope.go:117] "RemoveContainer" containerID="3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.258638 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-645d96f7d8-wkzsd"] Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.271267 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-645d96f7d8-wkzsd"] Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.271573 4841 scope.go:117] "RemoveContainer" containerID="3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0" Jan 30 06:39:40 crc kubenswrapper[4841]: E0130 06:39:40.272033 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0\": container with ID starting with 3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0 not found: ID does not exist" containerID="3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.272074 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0"} err="failed to get container status \"3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0\": rpc error: code = NotFound desc = could not find container \"3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0\": container with ID starting with 3355e23811cbec92fe8ce42bdd7447771a6a414dd863a2940d87d7fbf6dddcd0 not found: ID does not exist" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.272100 4841 scope.go:117] "RemoveContainer" containerID="3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093" Jan 30 06:39:40 crc kubenswrapper[4841]: E0130 06:39:40.272515 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093\": container with ID starting with 3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093 not found: ID does not exist" containerID="3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.272538 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093"} err="failed to get container status \"3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093\": rpc error: code = NotFound desc = could not find container \"3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093\": container with ID starting with 3e0bf149b5f60a21a2a226dde92bd11590333577455aec539e13906e9d540093 not found: ID does not exist" Jan 30 06:39:40 crc kubenswrapper[4841]: I0130 06:39:40.444359 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" path="/var/lib/kubelet/pods/5d14b133-0ff6-49a1-8c0c-991b8908855a/volumes" Jan 30 06:39:51 crc kubenswrapper[4841]: I0130 
06:39:51.527741 4841 scope.go:117] "RemoveContainer" containerID="e3239e27b50f1e3e211cedf510aff342256d92f96bac7311f773731e6e953b68" Jan 30 06:39:51 crc kubenswrapper[4841]: I0130 06:39:51.565866 4841 scope.go:117] "RemoveContainer" containerID="26bcd2dd5ac79725d9029453700cba134772cef5b02f992e3fd40717864117fd" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.114338 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9mnmt"] Jan 30 06:39:53 crc kubenswrapper[4841]: E0130 06:39:53.114963 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api-log" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.114976 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api-log" Jan 30 06:39:53 crc kubenswrapper[4841]: E0130 06:39:53.115003 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b73063-5741-44b0-9ac2-0fd44822928e" containerName="dnsmasq-dns" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.115009 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b73063-5741-44b0-9ac2-0fd44822928e" containerName="dnsmasq-dns" Jan 30 06:39:53 crc kubenswrapper[4841]: E0130 06:39:53.115021 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b73063-5741-44b0-9ac2-0fd44822928e" containerName="init" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.115027 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b73063-5741-44b0-9ac2-0fd44822928e" containerName="init" Jan 30 06:39:53 crc kubenswrapper[4841]: E0130 06:39:53.115042 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.115049 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" 
containerName="barbican-api" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.115183 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b73063-5741-44b0-9ac2-0fd44822928e" containerName="dnsmasq-dns" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.115200 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api-log" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.115214 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d14b133-0ff6-49a1-8c0c-991b8908855a" containerName="barbican-api" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.115740 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.123749 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9mnmt"] Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.201962 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h67d\" (UniqueName: \"kubernetes.io/projected/c9b11e52-0c46-441a-9938-0b4237eb9e94-kube-api-access-8h67d\") pod \"neutron-db-create-9mnmt\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.202004 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b11e52-0c46-441a-9938-0b4237eb9e94-operator-scripts\") pod \"neutron-db-create-9mnmt\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.219688 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1c57-account-create-update-gz9rp"] Jan 30 06:39:53 crc kubenswrapper[4841]: 
I0130 06:39:53.221101 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.223661 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.228666 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1c57-account-create-update-gz9rp"] Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.304130 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h67d\" (UniqueName: \"kubernetes.io/projected/c9b11e52-0c46-441a-9938-0b4237eb9e94-kube-api-access-8h67d\") pod \"neutron-db-create-9mnmt\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.304202 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b11e52-0c46-441a-9938-0b4237eb9e94-operator-scripts\") pod \"neutron-db-create-9mnmt\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.304235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-operator-scripts\") pod \"neutron-1c57-account-create-update-gz9rp\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.304358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxf4b\" (UniqueName: \"kubernetes.io/projected/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-kube-api-access-sxf4b\") pod 
\"neutron-1c57-account-create-update-gz9rp\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.305143 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b11e52-0c46-441a-9938-0b4237eb9e94-operator-scripts\") pod \"neutron-db-create-9mnmt\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.323553 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h67d\" (UniqueName: \"kubernetes.io/projected/c9b11e52-0c46-441a-9938-0b4237eb9e94-kube-api-access-8h67d\") pod \"neutron-db-create-9mnmt\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.405787 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-operator-scripts\") pod \"neutron-1c57-account-create-update-gz9rp\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.405912 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxf4b\" (UniqueName: \"kubernetes.io/projected/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-kube-api-access-sxf4b\") pod \"neutron-1c57-account-create-update-gz9rp\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.406486 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-operator-scripts\") pod \"neutron-1c57-account-create-update-gz9rp\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.422378 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxf4b\" (UniqueName: \"kubernetes.io/projected/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-kube-api-access-sxf4b\") pod \"neutron-1c57-account-create-update-gz9rp\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.438682 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.540657 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:53 crc kubenswrapper[4841]: I0130 06:39:53.740220 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9mnmt"] Jan 30 06:39:54 crc kubenswrapper[4841]: I0130 06:39:54.070740 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1c57-account-create-update-gz9rp"] Jan 30 06:39:54 crc kubenswrapper[4841]: W0130 06:39:54.073033 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9c76d5a_1ed6_49cc_8c42_ebc88890c477.slice/crio-bfffad294e619558761c5c202a87ce45ee2928efd7618ec2e84a2fb40ad8916c WatchSource:0}: Error finding container bfffad294e619558761c5c202a87ce45ee2928efd7618ec2e84a2fb40ad8916c: Status 404 returned error can't find the container with id bfffad294e619558761c5c202a87ce45ee2928efd7618ec2e84a2fb40ad8916c Jan 30 06:39:54 crc kubenswrapper[4841]: I0130 06:39:54.334111 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-1c57-account-create-update-gz9rp" event={"ID":"c9c76d5a-1ed6-49cc-8c42-ebc88890c477","Type":"ContainerStarted","Data":"cb797749fe33ff934f3ab20cebadd7d9c856c6147d540bbfb4367f4c5d56926b"} Jan 30 06:39:54 crc kubenswrapper[4841]: I0130 06:39:54.334155 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1c57-account-create-update-gz9rp" event={"ID":"c9c76d5a-1ed6-49cc-8c42-ebc88890c477","Type":"ContainerStarted","Data":"bfffad294e619558761c5c202a87ce45ee2928efd7618ec2e84a2fb40ad8916c"} Jan 30 06:39:54 crc kubenswrapper[4841]: I0130 06:39:54.338241 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9mnmt" event={"ID":"c9b11e52-0c46-441a-9938-0b4237eb9e94","Type":"ContainerStarted","Data":"c0c975c2ae8957381471e3b4bd5ef1260153a2516ce0e7e793c0053ca9ef7c14"} Jan 30 06:39:54 crc kubenswrapper[4841]: I0130 06:39:54.338274 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9mnmt" event={"ID":"c9b11e52-0c46-441a-9938-0b4237eb9e94","Type":"ContainerStarted","Data":"1219d400f917e3745b017411876655d46d5d557d4cead4d8eebaf1fd8e1a31f2"} Jan 30 06:39:54 crc kubenswrapper[4841]: I0130 06:39:54.353591 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1c57-account-create-update-gz9rp" podStartSLOduration=1.35357642 podStartE2EDuration="1.35357642s" podCreationTimestamp="2026-01-30 06:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:54.3517216 +0000 UTC m=+5531.345194238" watchObservedRunningTime="2026-01-30 06:39:54.35357642 +0000 UTC m=+5531.347049058" Jan 30 06:39:54 crc kubenswrapper[4841]: I0130 06:39:54.376037 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-9mnmt" podStartSLOduration=1.376016099 
podStartE2EDuration="1.376016099s" podCreationTimestamp="2026-01-30 06:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:39:54.37271259 +0000 UTC m=+5531.366185228" watchObservedRunningTime="2026-01-30 06:39:54.376016099 +0000 UTC m=+5531.369488737" Jan 30 06:39:55 crc kubenswrapper[4841]: I0130 06:39:55.351389 4841 generic.go:334] "Generic (PLEG): container finished" podID="c9b11e52-0c46-441a-9938-0b4237eb9e94" containerID="c0c975c2ae8957381471e3b4bd5ef1260153a2516ce0e7e793c0053ca9ef7c14" exitCode=0 Jan 30 06:39:55 crc kubenswrapper[4841]: I0130 06:39:55.351595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9mnmt" event={"ID":"c9b11e52-0c46-441a-9938-0b4237eb9e94","Type":"ContainerDied","Data":"c0c975c2ae8957381471e3b4bd5ef1260153a2516ce0e7e793c0053ca9ef7c14"} Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.362593 4841 generic.go:334] "Generic (PLEG): container finished" podID="c9c76d5a-1ed6-49cc-8c42-ebc88890c477" containerID="cb797749fe33ff934f3ab20cebadd7d9c856c6147d540bbfb4367f4c5d56926b" exitCode=0 Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.362794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1c57-account-create-update-gz9rp" event={"ID":"c9c76d5a-1ed6-49cc-8c42-ebc88890c477","Type":"ContainerDied","Data":"cb797749fe33ff934f3ab20cebadd7d9c856c6147d540bbfb4367f4c5d56926b"} Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.797360 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.868773 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b11e52-0c46-441a-9938-0b4237eb9e94-operator-scripts\") pod \"c9b11e52-0c46-441a-9938-0b4237eb9e94\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.868945 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h67d\" (UniqueName: \"kubernetes.io/projected/c9b11e52-0c46-441a-9938-0b4237eb9e94-kube-api-access-8h67d\") pod \"c9b11e52-0c46-441a-9938-0b4237eb9e94\" (UID: \"c9b11e52-0c46-441a-9938-0b4237eb9e94\") " Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.869302 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b11e52-0c46-441a-9938-0b4237eb9e94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9b11e52-0c46-441a-9938-0b4237eb9e94" (UID: "c9b11e52-0c46-441a-9938-0b4237eb9e94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.912511 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b11e52-0c46-441a-9938-0b4237eb9e94-kube-api-access-8h67d" (OuterVolumeSpecName: "kube-api-access-8h67d") pod "c9b11e52-0c46-441a-9938-0b4237eb9e94" (UID: "c9b11e52-0c46-441a-9938-0b4237eb9e94"). InnerVolumeSpecName "kube-api-access-8h67d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.971784 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b11e52-0c46-441a-9938-0b4237eb9e94-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:56 crc kubenswrapper[4841]: I0130 06:39:56.971831 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h67d\" (UniqueName: \"kubernetes.io/projected/c9b11e52-0c46-441a-9938-0b4237eb9e94-kube-api-access-8h67d\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.374820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9mnmt" event={"ID":"c9b11e52-0c46-441a-9938-0b4237eb9e94","Type":"ContainerDied","Data":"1219d400f917e3745b017411876655d46d5d557d4cead4d8eebaf1fd8e1a31f2"} Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.375224 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1219d400f917e3745b017411876655d46d5d557d4cead4d8eebaf1fd8e1a31f2" Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.374945 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9mnmt" Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.731822 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.785512 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxf4b\" (UniqueName: \"kubernetes.io/projected/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-kube-api-access-sxf4b\") pod \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.785605 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-operator-scripts\") pod \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\" (UID: \"c9c76d5a-1ed6-49cc-8c42-ebc88890c477\") " Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.786312 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9c76d5a-1ed6-49cc-8c42-ebc88890c477" (UID: "c9c76d5a-1ed6-49cc-8c42-ebc88890c477"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.788900 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-kube-api-access-sxf4b" (OuterVolumeSpecName: "kube-api-access-sxf4b") pod "c9c76d5a-1ed6-49cc-8c42-ebc88890c477" (UID: "c9c76d5a-1ed6-49cc-8c42-ebc88890c477"). InnerVolumeSpecName "kube-api-access-sxf4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.887526 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:57 crc kubenswrapper[4841]: I0130 06:39:57.887560 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxf4b\" (UniqueName: \"kubernetes.io/projected/c9c76d5a-1ed6-49cc-8c42-ebc88890c477-kube-api-access-sxf4b\") on node \"crc\" DevicePath \"\"" Jan 30 06:39:58 crc kubenswrapper[4841]: I0130 06:39:58.386495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1c57-account-create-update-gz9rp" event={"ID":"c9c76d5a-1ed6-49cc-8c42-ebc88890c477","Type":"ContainerDied","Data":"bfffad294e619558761c5c202a87ce45ee2928efd7618ec2e84a2fb40ad8916c"} Jan 30 06:39:58 crc kubenswrapper[4841]: I0130 06:39:58.386583 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfffad294e619558761c5c202a87ce45ee2928efd7618ec2e84a2fb40ad8916c" Jan 30 06:39:58 crc kubenswrapper[4841]: I0130 06:39:58.386577 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1c57-account-create-update-gz9rp" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.403245 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-b9xdq"] Jan 30 06:40:03 crc kubenswrapper[4841]: E0130 06:40:03.403905 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b11e52-0c46-441a-9938-0b4237eb9e94" containerName="mariadb-database-create" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.403921 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b11e52-0c46-441a-9938-0b4237eb9e94" containerName="mariadb-database-create" Jan 30 06:40:03 crc kubenswrapper[4841]: E0130 06:40:03.403940 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c76d5a-1ed6-49cc-8c42-ebc88890c477" containerName="mariadb-account-create-update" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.403948 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c76d5a-1ed6-49cc-8c42-ebc88890c477" containerName="mariadb-account-create-update" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.404133 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b11e52-0c46-441a-9938-0b4237eb9e94" containerName="mariadb-database-create" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.404152 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c76d5a-1ed6-49cc-8c42-ebc88890c477" containerName="mariadb-account-create-update" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.405421 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.408356 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.408644 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.409915 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jpdll" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.412488 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b9xdq"] Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.511789 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28pp5\" (UniqueName: \"kubernetes.io/projected/5825634c-2fce-4b58-8ed9-c6e113c162f0-kube-api-access-28pp5\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.512119 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-combined-ca-bundle\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.512213 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-config\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.614172 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-28pp5\" (UniqueName: \"kubernetes.io/projected/5825634c-2fce-4b58-8ed9-c6e113c162f0-kube-api-access-28pp5\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.614321 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-combined-ca-bundle\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.614350 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-config\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.621835 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-config\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.633085 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-combined-ca-bundle\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.634173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28pp5\" (UniqueName: 
\"kubernetes.io/projected/5825634c-2fce-4b58-8ed9-c6e113c162f0-kube-api-access-28pp5\") pod \"neutron-db-sync-b9xdq\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:03 crc kubenswrapper[4841]: I0130 06:40:03.725289 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:04 crc kubenswrapper[4841]: I0130 06:40:04.247382 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b9xdq"] Jan 30 06:40:04 crc kubenswrapper[4841]: I0130 06:40:04.446474 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b9xdq" event={"ID":"5825634c-2fce-4b58-8ed9-c6e113c162f0","Type":"ContainerStarted","Data":"f5e8fd2c969e9d4fc729d0c36af09264b2328b2e7fc46d880ee4ef3b97fde898"} Jan 30 06:40:05 crc kubenswrapper[4841]: I0130 06:40:05.455076 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b9xdq" event={"ID":"5825634c-2fce-4b58-8ed9-c6e113c162f0","Type":"ContainerStarted","Data":"8a99113da5cb60b54732ce2862f6e446264f7631e60823cea8617e66044c1d76"} Jan 30 06:40:05 crc kubenswrapper[4841]: I0130 06:40:05.475330 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-b9xdq" podStartSLOduration=2.475311849 podStartE2EDuration="2.475311849s" podCreationTimestamp="2026-01-30 06:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:05.473386588 +0000 UTC m=+5542.466859256" watchObservedRunningTime="2026-01-30 06:40:05.475311849 +0000 UTC m=+5542.468784487" Jan 30 06:40:09 crc kubenswrapper[4841]: I0130 06:40:09.488651 4841 generic.go:334] "Generic (PLEG): container finished" podID="5825634c-2fce-4b58-8ed9-c6e113c162f0" containerID="8a99113da5cb60b54732ce2862f6e446264f7631e60823cea8617e66044c1d76" exitCode=0 Jan 30 06:40:09 crc 
kubenswrapper[4841]: I0130 06:40:09.489469 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b9xdq" event={"ID":"5825634c-2fce-4b58-8ed9-c6e113c162f0","Type":"ContainerDied","Data":"8a99113da5cb60b54732ce2862f6e446264f7631e60823cea8617e66044c1d76"} Jan 30 06:40:10 crc kubenswrapper[4841]: I0130 06:40:10.964842 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.054923 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-config\") pod \"5825634c-2fce-4b58-8ed9-c6e113c162f0\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.055030 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28pp5\" (UniqueName: \"kubernetes.io/projected/5825634c-2fce-4b58-8ed9-c6e113c162f0-kube-api-access-28pp5\") pod \"5825634c-2fce-4b58-8ed9-c6e113c162f0\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.055199 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-combined-ca-bundle\") pod \"5825634c-2fce-4b58-8ed9-c6e113c162f0\" (UID: \"5825634c-2fce-4b58-8ed9-c6e113c162f0\") " Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.072681 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5825634c-2fce-4b58-8ed9-c6e113c162f0-kube-api-access-28pp5" (OuterVolumeSpecName: "kube-api-access-28pp5") pod "5825634c-2fce-4b58-8ed9-c6e113c162f0" (UID: "5825634c-2fce-4b58-8ed9-c6e113c162f0"). InnerVolumeSpecName "kube-api-access-28pp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.095227 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5825634c-2fce-4b58-8ed9-c6e113c162f0" (UID: "5825634c-2fce-4b58-8ed9-c6e113c162f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.105113 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-config" (OuterVolumeSpecName: "config") pod "5825634c-2fce-4b58-8ed9-c6e113c162f0" (UID: "5825634c-2fce-4b58-8ed9-c6e113c162f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.156848 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.156883 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28pp5\" (UniqueName: \"kubernetes.io/projected/5825634c-2fce-4b58-8ed9-c6e113c162f0-kube-api-access-28pp5\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.156931 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5825634c-2fce-4b58-8ed9-c6e113c162f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.519289 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b9xdq" 
event={"ID":"5825634c-2fce-4b58-8ed9-c6e113c162f0","Type":"ContainerDied","Data":"f5e8fd2c969e9d4fc729d0c36af09264b2328b2e7fc46d880ee4ef3b97fde898"} Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.519858 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5e8fd2c969e9d4fc729d0c36af09264b2328b2e7fc46d880ee4ef3b97fde898" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.519806 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b9xdq" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.706393 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fc84cddc-q5v86"] Jan 30 06:40:11 crc kubenswrapper[4841]: E0130 06:40:11.706927 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825634c-2fce-4b58-8ed9-c6e113c162f0" containerName="neutron-db-sync" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.706949 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825634c-2fce-4b58-8ed9-c6e113c162f0" containerName="neutron-db-sync" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.707200 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5825634c-2fce-4b58-8ed9-c6e113c162f0" containerName="neutron-db-sync" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.708457 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.715750 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fc84cddc-q5v86"] Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.772561 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-dns-svc\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.772654 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-nb\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.772685 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bs8\" (UniqueName: \"kubernetes.io/projected/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-kube-api-access-l4bs8\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.772717 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-config\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.772976 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-sb\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.778648 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f694c4c48-s8vmd"] Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.779939 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.784719 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.784903 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.784939 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.790447 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f694c4c48-s8vmd"] Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.797773 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jpdll" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875334 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-dns-svc\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-httpd-config\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-ovndb-tls-certs\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875491 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-nb\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875512 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bs8\" (UniqueName: \"kubernetes.io/projected/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-kube-api-access-l4bs8\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875534 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-config\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875597 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npsqb\" (UniqueName: 
\"kubernetes.io/projected/e1f23627-635e-4613-97aa-cc743d1800ad-kube-api-access-npsqb\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-sb\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-combined-ca-bundle\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.875671 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-config\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.877270 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-config\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.877292 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.877281 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-dns-svc\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.877329 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-sb\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.900787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bs8\" (UniqueName: \"kubernetes.io/projected/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-kube-api-access-l4bs8\") pod \"dnsmasq-dns-6fc84cddc-q5v86\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.977685 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npsqb\" (UniqueName: \"kubernetes.io/projected/e1f23627-635e-4613-97aa-cc743d1800ad-kube-api-access-npsqb\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.977752 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-combined-ca-bundle\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " 
pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.977783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-config\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.977848 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-httpd-config\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.977867 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-ovndb-tls-certs\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.981497 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-ovndb-tls-certs\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.983065 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-config\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.983094 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-combined-ca-bundle\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:11 crc kubenswrapper[4841]: I0130 06:40:11.983592 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-httpd-config\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:12 crc kubenswrapper[4841]: I0130 06:40:12.000007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npsqb\" (UniqueName: \"kubernetes.io/projected/e1f23627-635e-4613-97aa-cc743d1800ad-kube-api-access-npsqb\") pod \"neutron-f694c4c48-s8vmd\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:12 crc kubenswrapper[4841]: I0130 06:40:12.031085 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:12 crc kubenswrapper[4841]: I0130 06:40:12.096737 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:12 crc kubenswrapper[4841]: I0130 06:40:12.532259 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fc84cddc-q5v86"] Jan 30 06:40:12 crc kubenswrapper[4841]: W0130 06:40:12.532770 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34d8bc6_5a83_4050_b4eb_b66cb248dab6.slice/crio-cb003a2a0f9b997765ccc49df33a34d1627dd701042dfea7be19c94cb6fcd425 WatchSource:0}: Error finding container cb003a2a0f9b997765ccc49df33a34d1627dd701042dfea7be19c94cb6fcd425: Status 404 returned error can't find the container with id cb003a2a0f9b997765ccc49df33a34d1627dd701042dfea7be19c94cb6fcd425 Jan 30 06:40:12 crc kubenswrapper[4841]: I0130 06:40:12.715621 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f694c4c48-s8vmd"] Jan 30 06:40:12 crc kubenswrapper[4841]: W0130 06:40:12.721739 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1f23627_635e_4613_97aa_cc743d1800ad.slice/crio-b65b7993e13c15dc41adf95d7e5b2af979d807b82d442392e8a1a013310e3203 WatchSource:0}: Error finding container b65b7993e13c15dc41adf95d7e5b2af979d807b82d442392e8a1a013310e3203: Status 404 returned error can't find the container with id b65b7993e13c15dc41adf95d7e5b2af979d807b82d442392e8a1a013310e3203 Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.541212 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f694c4c48-s8vmd" event={"ID":"e1f23627-635e-4613-97aa-cc743d1800ad","Type":"ContainerStarted","Data":"5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c"} Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.542004 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 
06:40:13.542021 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f694c4c48-s8vmd" event={"ID":"e1f23627-635e-4613-97aa-cc743d1800ad","Type":"ContainerStarted","Data":"7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a"} Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.542036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f694c4c48-s8vmd" event={"ID":"e1f23627-635e-4613-97aa-cc743d1800ad","Type":"ContainerStarted","Data":"b65b7993e13c15dc41adf95d7e5b2af979d807b82d442392e8a1a013310e3203"} Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.542584 4841 generic.go:334] "Generic (PLEG): container finished" podID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerID="d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b" exitCode=0 Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.542614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" event={"ID":"b34d8bc6-5a83-4050-b4eb-b66cb248dab6","Type":"ContainerDied","Data":"d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b"} Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.542632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" event={"ID":"b34d8bc6-5a83-4050-b4eb-b66cb248dab6","Type":"ContainerStarted","Data":"cb003a2a0f9b997765ccc49df33a34d1627dd701042dfea7be19c94cb6fcd425"} Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.570036 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f694c4c48-s8vmd" podStartSLOduration=2.5700123599999998 podStartE2EDuration="2.57001236s" podCreationTimestamp="2026-01-30 06:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:13.562021957 +0000 UTC m=+5550.555494595" watchObservedRunningTime="2026-01-30 06:40:13.57001236 
+0000 UTC m=+5550.563484998" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.852855 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-589f6d954f-4jxxb"] Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.854056 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.856105 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.856863 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.875923 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-589f6d954f-4jxxb"] Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.914062 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-httpd-config\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.914122 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-public-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.914146 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-ovndb-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: 
\"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.914165 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwg8\" (UniqueName: \"kubernetes.io/projected/d0490b4c-ee5f-4676-80f4-0d239b99a182-kube-api-access-gfwg8\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.914218 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-config\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.914245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-internal-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:13 crc kubenswrapper[4841]: I0130 06:40:13.914278 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-combined-ca-bundle\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.015690 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-httpd-config\") pod \"neutron-589f6d954f-4jxxb\" (UID: 
\"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.015763 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-public-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.015786 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-ovndb-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.015808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwg8\" (UniqueName: \"kubernetes.io/projected/d0490b4c-ee5f-4676-80f4-0d239b99a182-kube-api-access-gfwg8\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.015866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-config\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.015889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-internal-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 
06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.015922 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-combined-ca-bundle\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.020942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-internal-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.021173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-httpd-config\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.021365 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-config\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.029249 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-ovndb-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.029289 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-public-tls-certs\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.029335 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0490b4c-ee5f-4676-80f4-0d239b99a182-combined-ca-bundle\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.035271 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwg8\" (UniqueName: \"kubernetes.io/projected/d0490b4c-ee5f-4676-80f4-0d239b99a182-kube-api-access-gfwg8\") pod \"neutron-589f6d954f-4jxxb\" (UID: \"d0490b4c-ee5f-4676-80f4-0d239b99a182\") " pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.173187 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.554386 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" event={"ID":"b34d8bc6-5a83-4050-b4eb-b66cb248dab6","Type":"ContainerStarted","Data":"0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620"} Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.554578 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.579240 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" podStartSLOduration=3.579219569 podStartE2EDuration="3.579219569s" podCreationTimestamp="2026-01-30 06:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:14.571698288 +0000 UTC m=+5551.565170936" watchObservedRunningTime="2026-01-30 06:40:14.579219569 +0000 UTC m=+5551.572692217" Jan 30 06:40:14 crc kubenswrapper[4841]: I0130 06:40:14.747282 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-589f6d954f-4jxxb"] Jan 30 06:40:14 crc kubenswrapper[4841]: W0130 06:40:14.751192 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0490b4c_ee5f_4676_80f4_0d239b99a182.slice/crio-0a34912514b7da359d0938c7763522acc39376a6fbb279de7cef39c196d7eb19 WatchSource:0}: Error finding container 0a34912514b7da359d0938c7763522acc39376a6fbb279de7cef39c196d7eb19: Status 404 returned error can't find the container with id 0a34912514b7da359d0938c7763522acc39376a6fbb279de7cef39c196d7eb19 Jan 30 06:40:15 crc kubenswrapper[4841]: I0130 06:40:15.564609 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6d954f-4jxxb" 
event={"ID":"d0490b4c-ee5f-4676-80f4-0d239b99a182","Type":"ContainerStarted","Data":"7f15051fa1f431654bc6664981395597407a989aca8cbfd80b00eef2a88c3f89"} Jan 30 06:40:15 crc kubenswrapper[4841]: I0130 06:40:15.565444 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6d954f-4jxxb" event={"ID":"d0490b4c-ee5f-4676-80f4-0d239b99a182","Type":"ContainerStarted","Data":"8c13a921823615eb3e281ed2658c12c447644aba3c350600e9254e022a8b1317"} Jan 30 06:40:15 crc kubenswrapper[4841]: I0130 06:40:15.565542 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:15 crc kubenswrapper[4841]: I0130 06:40:15.565614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6d954f-4jxxb" event={"ID":"d0490b4c-ee5f-4676-80f4-0d239b99a182","Type":"ContainerStarted","Data":"0a34912514b7da359d0938c7763522acc39376a6fbb279de7cef39c196d7eb19"} Jan 30 06:40:15 crc kubenswrapper[4841]: I0130 06:40:15.590037 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-589f6d954f-4jxxb" podStartSLOduration=2.5900208 podStartE2EDuration="2.5900208s" podCreationTimestamp="2026-01-30 06:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:40:15.582699655 +0000 UTC m=+5552.576172293" watchObservedRunningTime="2026-01-30 06:40:15.5900208 +0000 UTC m=+5552.583493438" Jan 30 06:40:22 crc kubenswrapper[4841]: I0130 06:40:22.033052 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:40:22 crc kubenswrapper[4841]: I0130 06:40:22.119216 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f4c986487-5m7g9"] Jan 30 06:40:22 crc kubenswrapper[4841]: I0130 06:40:22.119479 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerName="dnsmasq-dns" containerID="cri-o://c285ceb89751114494e42d3f9d4879dec3c330725d49efe61ef36818504762d2" gracePeriod=10 Jan 30 06:40:22 crc kubenswrapper[4841]: I0130 06:40:22.648335 4841 generic.go:334] "Generic (PLEG): container finished" podID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerID="c285ceb89751114494e42d3f9d4879dec3c330725d49efe61ef36818504762d2" exitCode=0 Jan 30 06:40:22 crc kubenswrapper[4841]: I0130 06:40:22.648374 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" event={"ID":"c0fde94d-b3ff-4448-bedd-eef41eccd1f1","Type":"ContainerDied","Data":"c285ceb89751114494e42d3f9d4879dec3c330725d49efe61ef36818504762d2"} Jan 30 06:40:22 crc kubenswrapper[4841]: I0130 06:40:22.750433 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.35:5353: connect: connection refused" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.117648 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.192182 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-config\") pod \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.192294 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckljd\" (UniqueName: \"kubernetes.io/projected/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-kube-api-access-ckljd\") pod \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.192415 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-dns-svc\") pod \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.192450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-nb\") pod \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.192477 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-sb\") pod \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\" (UID: \"c0fde94d-b3ff-4448-bedd-eef41eccd1f1\") " Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.199239 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-kube-api-access-ckljd" (OuterVolumeSpecName: "kube-api-access-ckljd") pod "c0fde94d-b3ff-4448-bedd-eef41eccd1f1" (UID: "c0fde94d-b3ff-4448-bedd-eef41eccd1f1"). InnerVolumeSpecName "kube-api-access-ckljd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.247534 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0fde94d-b3ff-4448-bedd-eef41eccd1f1" (UID: "c0fde94d-b3ff-4448-bedd-eef41eccd1f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.260073 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0fde94d-b3ff-4448-bedd-eef41eccd1f1" (UID: "c0fde94d-b3ff-4448-bedd-eef41eccd1f1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.269596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-config" (OuterVolumeSpecName: "config") pod "c0fde94d-b3ff-4448-bedd-eef41eccd1f1" (UID: "c0fde94d-b3ff-4448-bedd-eef41eccd1f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.270854 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0fde94d-b3ff-4448-bedd-eef41eccd1f1" (UID: "c0fde94d-b3ff-4448-bedd-eef41eccd1f1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.294284 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckljd\" (UniqueName: \"kubernetes.io/projected/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-kube-api-access-ckljd\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.294322 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.294331 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.294339 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.294348 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fde94d-b3ff-4448-bedd-eef41eccd1f1-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.658891 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" event={"ID":"c0fde94d-b3ff-4448-bedd-eef41eccd1f1","Type":"ContainerDied","Data":"2c8022419cb8f75fae6a206ab0c594c1bae80183d321bfdf91cfe5075784dcd5"} Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.658979 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f4c986487-5m7g9" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.659151 4841 scope.go:117] "RemoveContainer" containerID="c285ceb89751114494e42d3f9d4879dec3c330725d49efe61ef36818504762d2" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.704016 4841 scope.go:117] "RemoveContainer" containerID="2088ba1be7d6474a58aba0cb23f220660c5364b387c12555a5f5182debb7b37f" Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.718001 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f4c986487-5m7g9"] Jan 30 06:40:23 crc kubenswrapper[4841]: I0130 06:40:23.726950 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f4c986487-5m7g9"] Jan 30 06:40:24 crc kubenswrapper[4841]: I0130 06:40:24.450547 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" path="/var/lib/kubelet/pods/c0fde94d-b3ff-4448-bedd-eef41eccd1f1/volumes" Jan 30 06:40:42 crc kubenswrapper[4841]: I0130 06:40:42.112523 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:44 crc kubenswrapper[4841]: I0130 06:40:44.198321 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-589f6d954f-4jxxb" Jan 30 06:40:44 crc kubenswrapper[4841]: I0130 06:40:44.298777 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f694c4c48-s8vmd"] Jan 30 06:40:44 crc kubenswrapper[4841]: I0130 06:40:44.299055 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f694c4c48-s8vmd" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-api" containerID="cri-o://7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a" gracePeriod=30 Jan 30 06:40:44 crc kubenswrapper[4841]: I0130 06:40:44.299357 4841 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/neutron-f694c4c48-s8vmd" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-httpd" containerID="cri-o://5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c" gracePeriod=30 Jan 30 06:40:44 crc kubenswrapper[4841]: I0130 06:40:44.911268 4841 generic.go:334] "Generic (PLEG): container finished" podID="e1f23627-635e-4613-97aa-cc743d1800ad" containerID="5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c" exitCode=0 Jan 30 06:40:44 crc kubenswrapper[4841]: I0130 06:40:44.911317 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f694c4c48-s8vmd" event={"ID":"e1f23627-635e-4613-97aa-cc743d1800ad","Type":"ContainerDied","Data":"5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c"} Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.769447 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f694c4c48-s8vmd" Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.815463 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npsqb\" (UniqueName: \"kubernetes.io/projected/e1f23627-635e-4613-97aa-cc743d1800ad-kube-api-access-npsqb\") pod \"e1f23627-635e-4613-97aa-cc743d1800ad\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.815621 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-httpd-config\") pod \"e1f23627-635e-4613-97aa-cc743d1800ad\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.815789 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-combined-ca-bundle\") pod 
\"e1f23627-635e-4613-97aa-cc743d1800ad\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.815832 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-ovndb-tls-certs\") pod \"e1f23627-635e-4613-97aa-cc743d1800ad\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.815898 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-config\") pod \"e1f23627-635e-4613-97aa-cc743d1800ad\" (UID: \"e1f23627-635e-4613-97aa-cc743d1800ad\") " Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.828027 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f23627-635e-4613-97aa-cc743d1800ad-kube-api-access-npsqb" (OuterVolumeSpecName: "kube-api-access-npsqb") pod "e1f23627-635e-4613-97aa-cc743d1800ad" (UID: "e1f23627-635e-4613-97aa-cc743d1800ad"). InnerVolumeSpecName "kube-api-access-npsqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.842143 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e1f23627-635e-4613-97aa-cc743d1800ad" (UID: "e1f23627-635e-4613-97aa-cc743d1800ad"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.885562 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1f23627-635e-4613-97aa-cc743d1800ad" (UID: "e1f23627-635e-4613-97aa-cc743d1800ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.887708 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-config" (OuterVolumeSpecName: "config") pod "e1f23627-635e-4613-97aa-cc743d1800ad" (UID: "e1f23627-635e-4613-97aa-cc743d1800ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.918204 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npsqb\" (UniqueName: \"kubernetes.io/projected/e1f23627-635e-4613-97aa-cc743d1800ad-kube-api-access-npsqb\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.918948 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.919020 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.919080 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:58 crc kubenswrapper[4841]: I0130 06:40:58.930624 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e1f23627-635e-4613-97aa-cc743d1800ad" (UID: "e1f23627-635e-4613-97aa-cc743d1800ad"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.019869 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1f23627-635e-4613-97aa-cc743d1800ad-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.054272 4841 generic.go:334] "Generic (PLEG): container finished" podID="e1f23627-635e-4613-97aa-cc743d1800ad" containerID="7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a" exitCode=0
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.054327 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f694c4c48-s8vmd"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.054334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f694c4c48-s8vmd" event={"ID":"e1f23627-635e-4613-97aa-cc743d1800ad","Type":"ContainerDied","Data":"7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a"}
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.054388 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f694c4c48-s8vmd" event={"ID":"e1f23627-635e-4613-97aa-cc743d1800ad","Type":"ContainerDied","Data":"b65b7993e13c15dc41adf95d7e5b2af979d807b82d442392e8a1a013310e3203"}
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.054439 4841 scope.go:117] "RemoveContainer" containerID="5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.084063 4841 scope.go:117] "RemoveContainer" containerID="7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.087161 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f694c4c48-s8vmd"]
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.103770 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f694c4c48-s8vmd"]
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.114818 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mt4km"]
Jan 30 06:40:59 crc kubenswrapper[4841]: E0130 06:40:59.115167 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-api"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.115182 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-api"
Jan 30 06:40:59 crc kubenswrapper[4841]: E0130 06:40:59.115208 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-httpd"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.115216 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-httpd"
Jan 30 06:40:59 crc kubenswrapper[4841]: E0130 06:40:59.115239 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerName="dnsmasq-dns"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.115246 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerName="dnsmasq-dns"
Jan 30 06:40:59 crc kubenswrapper[4841]: E0130 06:40:59.115265 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerName="init"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.115273 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerName="init"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.115650 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-httpd"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.115686 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0fde94d-b3ff-4448-bedd-eef41eccd1f1" containerName="dnsmasq-dns"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.115710 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" containerName="neutron-api"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.116365 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.123633 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.123979 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-s5k5j"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.124123 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.124222 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.124322 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.132139 4841 scope.go:117] "RemoveContainer" containerID="5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c"
Jan 30 06:40:59 crc kubenswrapper[4841]: E0130 06:40:59.133388 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c\": container with ID starting with 5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c not found: ID does not exist" containerID="5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.133456 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c"} err="failed to get container status \"5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c\": rpc error: code = NotFound desc = could not find container \"5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c\": container with ID starting with 5f9e2293e551fd7e3b5ebd4a57d1ba8c8100da1f613b416f4d909108f6d0917c not found: ID does not exist"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.133487 4841 scope.go:117] "RemoveContainer" containerID="7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a"
Jan 30 06:40:59 crc kubenswrapper[4841]: E0130 06:40:59.134339 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a\": container with ID starting with 7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a not found: ID does not exist" containerID="7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.134373 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a"} err="failed to get container status \"7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a\": rpc error: code = NotFound desc = could not find container \"7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a\": container with ID starting with 7f7ab32263590e441126808d965e19799ed3c176d9547bb4a7426b1e60c1d60a not found: ID does not exist"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.153933 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mt4km"]
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.210185 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f65fd76ff-29kqp"]
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.211508 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.224727 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-etc-swift\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.224785 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-scripts\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.224804 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-swiftconf\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.224863 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spnnv\" (UniqueName: \"kubernetes.io/projected/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-kube-api-access-spnnv\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.224903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-dispersionconf\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.224924 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-ring-data-devices\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.224971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-combined-ca-bundle\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.240058 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f65fd76ff-29kqp"]
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-scripts\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326583 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-swiftconf\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326638 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-dns-svc\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326679 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spnnv\" (UniqueName: \"kubernetes.io/projected/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-kube-api-access-spnnv\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326718 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-dispersionconf\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326735 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326759 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-ring-data-devices\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326779 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgblx\" (UniqueName: \"kubernetes.io/projected/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-kube-api-access-tgblx\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326801 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326821 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-config\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326837 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-combined-ca-bundle\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.326860 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-etc-swift\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.327207 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-etc-swift\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.327279 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-scripts\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.328072 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-ring-data-devices\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.334155 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-dispersionconf\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.352037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-swiftconf\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.366065 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-combined-ca-bundle\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.375041 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spnnv\" (UniqueName: \"kubernetes.io/projected/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-kube-api-access-spnnv\") pod \"swift-ring-rebalance-mt4km\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.430894 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.430941 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-config\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.431014 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-dns-svc\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.431068 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.431100 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgblx\" (UniqueName: \"kubernetes.io/projected/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-kube-api-access-tgblx\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.432338 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.432852 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-config\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.433324 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-dns-svc\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.437924 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.440692 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mt4km"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.475984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgblx\" (UniqueName: \"kubernetes.io/projected/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-kube-api-access-tgblx\") pod \"dnsmasq-dns-5f65fd76ff-29kqp\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.556993 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.621931 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2xqx8"]
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.623925 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.632766 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xqx8"]
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.736264 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-utilities\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.736310 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxj7\" (UniqueName: \"kubernetes.io/projected/e12061be-982a-4441-baa0-7d1da89faede-kube-api-access-7zxj7\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.736362 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-catalog-content\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.837597 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-utilities\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.837636 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxj7\" (UniqueName: \"kubernetes.io/projected/e12061be-982a-4441-baa0-7d1da89faede-kube-api-access-7zxj7\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.837675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-catalog-content\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.838109 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-catalog-content\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.838302 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-utilities\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.872604 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxj7\" (UniqueName: \"kubernetes.io/projected/e12061be-982a-4441-baa0-7d1da89faede-kube-api-access-7zxj7\") pod \"community-operators-2xqx8\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.944423 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mt4km"]
Jan 30 06:40:59 crc kubenswrapper[4841]: W0130 06:40:59.947007 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode345d041_d2b4_4aa7_aae0_35c5dbba9d0b.slice/crio-76d7c66530d14bedd571eda34d87516b99f9769b343ef49d794cb2d6c5a4c41f WatchSource:0}: Error finding container 76d7c66530d14bedd571eda34d87516b99f9769b343ef49d794cb2d6c5a4c41f: Status 404 returned error can't find the container with id 76d7c66530d14bedd571eda34d87516b99f9769b343ef49d794cb2d6c5a4c41f
Jan 30 06:40:59 crc kubenswrapper[4841]: I0130 06:40:59.961328 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xqx8"
Jan 30 06:41:00 crc kubenswrapper[4841]: I0130 06:41:00.097411 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f65fd76ff-29kqp"]
Jan 30 06:41:00 crc kubenswrapper[4841]: I0130 06:41:00.112085 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mt4km" event={"ID":"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b","Type":"ContainerStarted","Data":"76d7c66530d14bedd571eda34d87516b99f9769b343ef49d794cb2d6c5a4c41f"}
Jan 30 06:41:00 crc kubenswrapper[4841]: W0130 06:41:00.120818 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98ed3339_b24b_44c5_b2f5_47a5c84a39cd.slice/crio-f5110dd8434a3c3cdc102ed4f1bae36a2437ea05b036f9528ef33eec3afa446c WatchSource:0}: Error finding container f5110dd8434a3c3cdc102ed4f1bae36a2437ea05b036f9528ef33eec3afa446c: Status 404 returned error can't find the container with id f5110dd8434a3c3cdc102ed4f1bae36a2437ea05b036f9528ef33eec3afa446c
Jan 30 06:41:00 crc kubenswrapper[4841]: I0130 06:41:00.421097 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xqx8"]
Jan 30 06:41:00 crc kubenswrapper[4841]: W0130 06:41:00.428657 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12061be_982a_4441_baa0_7d1da89faede.slice/crio-74689e5a8dba0695914d7c7ae1e753399179b8b61f59f8cc263ec3b6ead6437f WatchSource:0}: Error finding container 74689e5a8dba0695914d7c7ae1e753399179b8b61f59f8cc263ec3b6ead6437f: Status 404 returned error can't find the container with id 74689e5a8dba0695914d7c7ae1e753399179b8b61f59f8cc263ec3b6ead6437f
Jan 30 06:41:00 crc kubenswrapper[4841]: I0130 06:41:00.439707 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f23627-635e-4613-97aa-cc743d1800ad" path="/var/lib/kubelet/pods/e1f23627-635e-4613-97aa-cc743d1800ad/volumes"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.122735 4841 generic.go:334] "Generic (PLEG): container finished" podID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerID="65ab125f74e01f81d0468ef3b9613680f09d9bcd57c7cdafaa370a8e415c699a" exitCode=0
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.122819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" event={"ID":"98ed3339-b24b-44c5-b2f5-47a5c84a39cd","Type":"ContainerDied","Data":"65ab125f74e01f81d0468ef3b9613680f09d9bcd57c7cdafaa370a8e415c699a"}
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.122873 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" event={"ID":"98ed3339-b24b-44c5-b2f5-47a5c84a39cd","Type":"ContainerStarted","Data":"f5110dd8434a3c3cdc102ed4f1bae36a2437ea05b036f9528ef33eec3afa446c"}
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.126938 4841 generic.go:334] "Generic (PLEG): container finished" podID="e12061be-982a-4441-baa0-7d1da89faede" containerID="8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d" exitCode=0
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.127016 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqx8" event={"ID":"e12061be-982a-4441-baa0-7d1da89faede","Type":"ContainerDied","Data":"8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d"}
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.127042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqx8" event={"ID":"e12061be-982a-4441-baa0-7d1da89faede","Type":"ContainerStarted","Data":"74689e5a8dba0695914d7c7ae1e753399179b8b61f59f8cc263ec3b6ead6437f"}
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.135576 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mt4km" event={"ID":"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b","Type":"ContainerStarted","Data":"49590a69bd692d7838bf8e4c7214aaeb0d2b3727c496d838fb409d9ab693be83"}
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.228970 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mt4km" podStartSLOduration=2.22893664 podStartE2EDuration="2.22893664s" podCreationTimestamp="2026-01-30 06:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:01.212115392 +0000 UTC m=+5598.205588030" watchObservedRunningTime="2026-01-30 06:41:01.22893664 +0000 UTC m=+5598.222409278"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.587855 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5db8797b66-krqln"]
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.594021 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.599663 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.653565 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5db8797b66-krqln"]
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.775273 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-run-httpd\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.775458 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-etc-swift\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.775527 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-config-data\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.775770 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-log-httpd\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.775817 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbtkk\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-kube-api-access-fbtkk\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.775930 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-combined-ca-bundle\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.877129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-etc-swift\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.877194 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-config-data\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.877267 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-log-httpd\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln"
Jan 30 06:41:01 crc
kubenswrapper[4841]: I0130 06:41:01.877290 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbtkk\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-kube-api-access-fbtkk\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.877331 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-combined-ca-bundle\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.877414 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-run-httpd\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.877942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-run-httpd\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.878071 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-log-httpd\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.885592 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-config-data\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.886701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-etc-swift\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.888810 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-combined-ca-bundle\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.898057 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbtkk\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-kube-api-access-fbtkk\") pod \"swift-proxy-5db8797b66-krqln\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:01 crc kubenswrapper[4841]: I0130 06:41:01.930042 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:02 crc kubenswrapper[4841]: I0130 06:41:02.148340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" event={"ID":"98ed3339-b24b-44c5-b2f5-47a5c84a39cd","Type":"ContainerStarted","Data":"ac000c6e9be776fa61763225bed18d5d5cb0c95daccf056091f71f064a0d7d38"} Jan 30 06:41:02 crc kubenswrapper[4841]: I0130 06:41:02.148916 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" Jan 30 06:41:02 crc kubenswrapper[4841]: I0130 06:41:02.156118 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqx8" event={"ID":"e12061be-982a-4441-baa0-7d1da89faede","Type":"ContainerStarted","Data":"447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894"} Jan 30 06:41:02 crc kubenswrapper[4841]: I0130 06:41:02.167928 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" podStartSLOduration=3.167915456 podStartE2EDuration="3.167915456s" podCreationTimestamp="2026-01-30 06:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:02.165579104 +0000 UTC m=+5599.159051742" watchObservedRunningTime="2026-01-30 06:41:02.167915456 +0000 UTC m=+5599.161388084" Jan 30 06:41:02 crc kubenswrapper[4841]: I0130 06:41:02.608319 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5db8797b66-krqln"] Jan 30 06:41:02 crc kubenswrapper[4841]: W0130 06:41:02.618230 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod975a1df1_de82_496c_b7cf_7dd0a83bb6de.slice/crio-155342417fabb4f88c5274deda7312c644986b075de2462cc3cf4ffea17c11f4 WatchSource:0}: Error finding container 
155342417fabb4f88c5274deda7312c644986b075de2462cc3cf4ffea17c11f4: Status 404 returned error can't find the container with id 155342417fabb4f88c5274deda7312c644986b075de2462cc3cf4ffea17c11f4 Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.032620 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-666dbb5c86-lvn9k"] Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.038666 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.041531 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.042308 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.064844 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-666dbb5c86-lvn9k"] Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.193589 4841 generic.go:334] "Generic (PLEG): container finished" podID="e12061be-982a-4441-baa0-7d1da89faede" containerID="447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894" exitCode=0 Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.193664 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqx8" event={"ID":"e12061be-982a-4441-baa0-7d1da89faede","Type":"ContainerDied","Data":"447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894"} Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.201293 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5db8797b66-krqln" event={"ID":"975a1df1-de82-496c-b7cf-7dd0a83bb6de","Type":"ContainerStarted","Data":"b6b769ee2b4984aa64b8bbd41e2662acf984ed8acd7111aeda3fed6fec681422"} Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.201783 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5db8797b66-krqln" event={"ID":"975a1df1-de82-496c-b7cf-7dd0a83bb6de","Type":"ContainerStarted","Data":"fd262c94e2fedea4bb7803fb7f54e1fab69cfa0f020b0094dc66d102ca526360"} Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.201797 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5db8797b66-krqln" event={"ID":"975a1df1-de82-496c-b7cf-7dd0a83bb6de","Type":"ContainerStarted","Data":"155342417fabb4f88c5274deda7312c644986b075de2462cc3cf4ffea17c11f4"} Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.202910 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-combined-ca-bundle\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.202961 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-internal-tls-certs\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.202989 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eab423d-c14d-4368-8fa2-7cf5147b9410-log-httpd\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.203043 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6eab423d-c14d-4368-8fa2-7cf5147b9410-run-httpd\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.203067 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6eab423d-c14d-4368-8fa2-7cf5147b9410-etc-swift\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.203100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-config-data\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.203135 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-public-tls-certs\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.203191 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg45c\" (UniqueName: \"kubernetes.io/projected/6eab423d-c14d-4368-8fa2-7cf5147b9410-kube-api-access-xg45c\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.204162 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.204197 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.239613 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5db8797b66-krqln" podStartSLOduration=2.239591811 podStartE2EDuration="2.239591811s" podCreationTimestamp="2026-01-30 06:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:03.23654978 +0000 UTC m=+5600.230022418" watchObservedRunningTime="2026-01-30 06:41:03.239591811 +0000 UTC m=+5600.233064459" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.305639 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-config-data\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.305746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-public-tls-certs\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.305823 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg45c\" (UniqueName: \"kubernetes.io/projected/6eab423d-c14d-4368-8fa2-7cf5147b9410-kube-api-access-xg45c\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc 
kubenswrapper[4841]: I0130 06:41:03.305859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-combined-ca-bundle\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.305901 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-internal-tls-certs\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.305930 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eab423d-c14d-4368-8fa2-7cf5147b9410-log-httpd\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.306030 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eab423d-c14d-4368-8fa2-7cf5147b9410-run-httpd\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.306054 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6eab423d-c14d-4368-8fa2-7cf5147b9410-etc-swift\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.307474 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eab423d-c14d-4368-8fa2-7cf5147b9410-log-httpd\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.308956 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eab423d-c14d-4368-8fa2-7cf5147b9410-run-httpd\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.311016 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-combined-ca-bundle\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.311051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-public-tls-certs\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.311139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-internal-tls-certs\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.311752 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6eab423d-c14d-4368-8fa2-7cf5147b9410-config-data\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.315275 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6eab423d-c14d-4368-8fa2-7cf5147b9410-etc-swift\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.326151 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg45c\" (UniqueName: \"kubernetes.io/projected/6eab423d-c14d-4368-8fa2-7cf5147b9410-kube-api-access-xg45c\") pod \"swift-proxy-666dbb5c86-lvn9k\" (UID: \"6eab423d-c14d-4368-8fa2-7cf5147b9410\") " pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:03 crc kubenswrapper[4841]: I0130 06:41:03.411936 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:04 crc kubenswrapper[4841]: I0130 06:41:04.000598 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-666dbb5c86-lvn9k"] Jan 30 06:41:04 crc kubenswrapper[4841]: I0130 06:41:04.209193 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqx8" event={"ID":"e12061be-982a-4441-baa0-7d1da89faede","Type":"ContainerStarted","Data":"554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465"} Jan 30 06:41:04 crc kubenswrapper[4841]: I0130 06:41:04.214736 4841 generic.go:334] "Generic (PLEG): container finished" podID="e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" containerID="49590a69bd692d7838bf8e4c7214aaeb0d2b3727c496d838fb409d9ab693be83" exitCode=0 Jan 30 06:41:04 crc kubenswrapper[4841]: I0130 06:41:04.214789 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mt4km" event={"ID":"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b","Type":"ContainerDied","Data":"49590a69bd692d7838bf8e4c7214aaeb0d2b3727c496d838fb409d9ab693be83"} Jan 30 06:41:04 crc kubenswrapper[4841]: I0130 06:41:04.216519 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-666dbb5c86-lvn9k" event={"ID":"6eab423d-c14d-4368-8fa2-7cf5147b9410","Type":"ContainerStarted","Data":"58b434d58e1785e40cd482829c5468744bf155ef947eb6389aea6eb7e810eaa9"} Jan 30 06:41:04 crc kubenswrapper[4841]: I0130 06:41:04.236465 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2xqx8" podStartSLOduration=2.597763108 podStartE2EDuration="5.236385798s" podCreationTimestamp="2026-01-30 06:40:59 +0000 UTC" firstStartedPulling="2026-01-30 06:41:01.128971444 +0000 UTC m=+5598.122444102" lastFinishedPulling="2026-01-30 06:41:03.767594154 +0000 UTC m=+5600.761066792" observedRunningTime="2026-01-30 06:41:04.226290749 +0000 UTC m=+5601.219763387" 
watchObservedRunningTime="2026-01-30 06:41:04.236385798 +0000 UTC m=+5601.229858436" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.224564 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-666dbb5c86-lvn9k" event={"ID":"6eab423d-c14d-4368-8fa2-7cf5147b9410","Type":"ContainerStarted","Data":"d04ff4dffca62dd971da591a1f49119530558571faeb6b008aaa52aa23f5e7c0"} Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.227249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-666dbb5c86-lvn9k" event={"ID":"6eab423d-c14d-4368-8fa2-7cf5147b9410","Type":"ContainerStarted","Data":"a27f3765c56b87f2bf2d2b1404d3efdb1bddfbc58a77d548195da3977eef5dd7"} Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.255273 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-666dbb5c86-lvn9k" podStartSLOduration=2.255253055 podStartE2EDuration="2.255253055s" podCreationTimestamp="2026-01-30 06:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:05.249796329 +0000 UTC m=+5602.243268967" watchObservedRunningTime="2026-01-30 06:41:05.255253055 +0000 UTC m=+5602.248725703" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.692183 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mt4km" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.763301 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spnnv\" (UniqueName: \"kubernetes.io/projected/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-kube-api-access-spnnv\") pod \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.763436 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-etc-swift\") pod \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.763501 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-combined-ca-bundle\") pod \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.763584 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-scripts\") pod \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.763642 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-dispersionconf\") pod \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.763665 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-ring-data-devices\") pod \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.763735 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-swiftconf\") pod \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\" (UID: \"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b\") " Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.764179 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" (UID: "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.764869 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.764964 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" (UID: "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.769264 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-kube-api-access-spnnv" (OuterVolumeSpecName: "kube-api-access-spnnv") pod "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" (UID: "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b"). InnerVolumeSpecName "kube-api-access-spnnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.773277 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" (UID: "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.799947 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" (UID: "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.813821 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-scripts" (OuterVolumeSpecName: "scripts") pod "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" (UID: "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.820548 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" (UID: "e345d041-d2b4-4aa7-aae0-35c5dbba9d0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.866632 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.866677 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.866690 4841 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.866703 4841 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.866714 4841 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:05 crc kubenswrapper[4841]: I0130 06:41:05.866725 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spnnv\" (UniqueName: 
\"kubernetes.io/projected/e345d041-d2b4-4aa7-aae0-35c5dbba9d0b-kube-api-access-spnnv\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:06 crc kubenswrapper[4841]: I0130 06:41:06.238095 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mt4km" event={"ID":"e345d041-d2b4-4aa7-aae0-35c5dbba9d0b","Type":"ContainerDied","Data":"76d7c66530d14bedd571eda34d87516b99f9769b343ef49d794cb2d6c5a4c41f"} Jan 30 06:41:06 crc kubenswrapper[4841]: I0130 06:41:06.238151 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76d7c66530d14bedd571eda34d87516b99f9769b343ef49d794cb2d6c5a4c41f" Jan 30 06:41:06 crc kubenswrapper[4841]: I0130 06:41:06.238180 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mt4km" Jan 30 06:41:06 crc kubenswrapper[4841]: I0130 06:41:06.238552 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:06 crc kubenswrapper[4841]: I0130 06:41:06.238858 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:09 crc kubenswrapper[4841]: I0130 06:41:09.558961 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" Jan 30 06:41:09 crc kubenswrapper[4841]: I0130 06:41:09.626989 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fc84cddc-q5v86"] Jan 30 06:41:09 crc kubenswrapper[4841]: I0130 06:41:09.627216 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" podUID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerName="dnsmasq-dns" containerID="cri-o://0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620" gracePeriod=10 Jan 30 06:41:09 crc kubenswrapper[4841]: I0130 06:41:09.962050 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2xqx8" Jan 30 06:41:09 crc kubenswrapper[4841]: I0130 06:41:09.962378 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2xqx8" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.019180 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2xqx8" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.094762 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.263219 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-config\") pod \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.263342 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-dns-svc\") pod \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.263379 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-nb\") pod \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.263457 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bs8\" (UniqueName: \"kubernetes.io/projected/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-kube-api-access-l4bs8\") pod 
\"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.263526 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-sb\") pod \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\" (UID: \"b34d8bc6-5a83-4050-b4eb-b66cb248dab6\") " Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.269020 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-kube-api-access-l4bs8" (OuterVolumeSpecName: "kube-api-access-l4bs8") pod "b34d8bc6-5a83-4050-b4eb-b66cb248dab6" (UID: "b34d8bc6-5a83-4050-b4eb-b66cb248dab6"). InnerVolumeSpecName "kube-api-access-l4bs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.281240 4841 generic.go:334] "Generic (PLEG): container finished" podID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerID="0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620" exitCode=0 Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.282386 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.282475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" event={"ID":"b34d8bc6-5a83-4050-b4eb-b66cb248dab6","Type":"ContainerDied","Data":"0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620"} Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.282526 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc84cddc-q5v86" event={"ID":"b34d8bc6-5a83-4050-b4eb-b66cb248dab6","Type":"ContainerDied","Data":"cb003a2a0f9b997765ccc49df33a34d1627dd701042dfea7be19c94cb6fcd425"} Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.282546 4841 scope.go:117] "RemoveContainer" containerID="0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.317933 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b34d8bc6-5a83-4050-b4eb-b66cb248dab6" (UID: "b34d8bc6-5a83-4050-b4eb-b66cb248dab6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.326253 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-config" (OuterVolumeSpecName: "config") pod "b34d8bc6-5a83-4050-b4eb-b66cb248dab6" (UID: "b34d8bc6-5a83-4050-b4eb-b66cb248dab6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.326933 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b34d8bc6-5a83-4050-b4eb-b66cb248dab6" (UID: "b34d8bc6-5a83-4050-b4eb-b66cb248dab6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.336022 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b34d8bc6-5a83-4050-b4eb-b66cb248dab6" (UID: "b34d8bc6-5a83-4050-b4eb-b66cb248dab6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.338251 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2xqx8" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.365968 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4bs8\" (UniqueName: \"kubernetes.io/projected/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-kube-api-access-l4bs8\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.366007 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.366021 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.366033 4841 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.366044 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34d8bc6-5a83-4050-b4eb-b66cb248dab6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.392358 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xqx8"] Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.398201 4841 scope.go:117] "RemoveContainer" containerID="d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.421070 4841 scope.go:117] "RemoveContainer" containerID="0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620" Jan 30 06:41:10 crc kubenswrapper[4841]: E0130 06:41:10.421426 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620\": container with ID starting with 0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620 not found: ID does not exist" containerID="0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.421463 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620"} err="failed to get container status \"0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620\": rpc error: code = NotFound desc = could not find container \"0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620\": container with ID starting with 
0ed7637c1f818aef8ec49f8f1abf4bd14ab310d79cdb25d6d52ba3a7bdea6620 not found: ID does not exist" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.421490 4841 scope.go:117] "RemoveContainer" containerID="d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b" Jan 30 06:41:10 crc kubenswrapper[4841]: E0130 06:41:10.421807 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b\": container with ID starting with d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b not found: ID does not exist" containerID="d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.421836 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b"} err="failed to get container status \"d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b\": rpc error: code = NotFound desc = could not find container \"d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b\": container with ID starting with d130794e0a6cdfa92432f727865f05440ca84039d67e43c4bf287737b1ff617b not found: ID does not exist" Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.608198 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fc84cddc-q5v86"] Jan 30 06:41:10 crc kubenswrapper[4841]: I0130 06:41:10.614509 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fc84cddc-q5v86"] Jan 30 06:41:11 crc kubenswrapper[4841]: I0130 06:41:11.932781 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:11 crc kubenswrapper[4841]: I0130 06:41:11.934802 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.297450 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2xqx8" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="registry-server" containerID="cri-o://554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465" gracePeriod=2 Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.455384 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" path="/var/lib/kubelet/pods/b34d8bc6-5a83-4050-b4eb-b66cb248dab6/volumes" Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.759198 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xqx8" Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.909349 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-utilities\") pod \"e12061be-982a-4441-baa0-7d1da89faede\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.909454 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-catalog-content\") pod \"e12061be-982a-4441-baa0-7d1da89faede\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.909574 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zxj7\" (UniqueName: \"kubernetes.io/projected/e12061be-982a-4441-baa0-7d1da89faede-kube-api-access-7zxj7\") pod \"e12061be-982a-4441-baa0-7d1da89faede\" (UID: \"e12061be-982a-4441-baa0-7d1da89faede\") " Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.912450 
4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-utilities" (OuterVolumeSpecName: "utilities") pod "e12061be-982a-4441-baa0-7d1da89faede" (UID: "e12061be-982a-4441-baa0-7d1da89faede"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:41:12 crc kubenswrapper[4841]: I0130 06:41:12.917371 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e12061be-982a-4441-baa0-7d1da89faede-kube-api-access-7zxj7" (OuterVolumeSpecName: "kube-api-access-7zxj7") pod "e12061be-982a-4441-baa0-7d1da89faede" (UID: "e12061be-982a-4441-baa0-7d1da89faede"). InnerVolumeSpecName "kube-api-access-7zxj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.012104 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zxj7\" (UniqueName: \"kubernetes.io/projected/e12061be-982a-4441-baa0-7d1da89faede-kube-api-access-7zxj7\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.012129 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.309788 4841 generic.go:334] "Generic (PLEG): container finished" podID="e12061be-982a-4441-baa0-7d1da89faede" containerID="554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465" exitCode=0 Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.309874 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqx8" event={"ID":"e12061be-982a-4441-baa0-7d1da89faede","Type":"ContainerDied","Data":"554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465"} Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 
06:41:13.310193 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqx8" event={"ID":"e12061be-982a-4441-baa0-7d1da89faede","Type":"ContainerDied","Data":"74689e5a8dba0695914d7c7ae1e753399179b8b61f59f8cc263ec3b6ead6437f"} Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.309951 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xqx8" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.310218 4841 scope.go:117] "RemoveContainer" containerID="554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.342240 4841 scope.go:117] "RemoveContainer" containerID="447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.373771 4841 scope.go:117] "RemoveContainer" containerID="8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.422474 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e12061be-982a-4441-baa0-7d1da89faede" (UID: "e12061be-982a-4441-baa0-7d1da89faede"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.430801 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.430849 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-666dbb5c86-lvn9k" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.441022 4841 scope.go:117] "RemoveContainer" containerID="554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465" Jan 30 06:41:13 crc kubenswrapper[4841]: E0130 06:41:13.441866 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465\": container with ID starting with 554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465 not found: ID does not exist" containerID="554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.441906 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465"} err="failed to get container status \"554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465\": rpc error: code = NotFound desc = could not find container \"554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465\": container with ID starting with 554eb7b109f26ec4d07c4258073bb89e73499debd80acd55cbbffe8f0ab68465 not found: ID does not exist" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.441929 4841 scope.go:117] "RemoveContainer" containerID="447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894" Jan 30 06:41:13 crc kubenswrapper[4841]: E0130 06:41:13.442237 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894\": container with ID starting with 447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894 not found: ID does not exist" containerID="447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.442254 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894"} err="failed to get container status \"447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894\": rpc error: code = NotFound desc = could not find container \"447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894\": container with ID starting with 447f61a7b9f4403aa1b6d5510cdd4308d7fb741b1c037f2aaf24d50782ce4894 not found: ID does not exist" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.442267 4841 scope.go:117] "RemoveContainer" containerID="8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d" Jan 30 06:41:13 crc kubenswrapper[4841]: E0130 06:41:13.442795 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d\": container with ID starting with 8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d not found: ID does not exist" containerID="8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.442816 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d"} err="failed to get container status \"8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d\": rpc error: code = NotFound desc = could not find container 
\"8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d\": container with ID starting with 8c677b9630005a503c0c52a8017c1b3519a71f5ec6e024bfc27d1f4e97396f2d not found: ID does not exist" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.488258 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5db8797b66-krqln"] Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.490228 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5db8797b66-krqln" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-httpd" containerID="cri-o://fd262c94e2fedea4bb7803fb7f54e1fab69cfa0f020b0094dc66d102ca526360" gracePeriod=30 Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.490682 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5db8797b66-krqln" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-server" containerID="cri-o://b6b769ee2b4984aa64b8bbd41e2662acf984ed8acd7111aeda3fed6fec681422" gracePeriod=30 Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.533357 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e12061be-982a-4441-baa0-7d1da89faede-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.684388 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xqx8"] Jan 30 06:41:13 crc kubenswrapper[4841]: I0130 06:41:13.692736 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2xqx8"] Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.319704 4841 generic.go:334] "Generic (PLEG): container finished" podID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerID="b6b769ee2b4984aa64b8bbd41e2662acf984ed8acd7111aeda3fed6fec681422" exitCode=0 Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 
06:41:14.319906 4841 generic.go:334] "Generic (PLEG): container finished" podID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerID="fd262c94e2fedea4bb7803fb7f54e1fab69cfa0f020b0094dc66d102ca526360" exitCode=0 Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.319878 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5db8797b66-krqln" event={"ID":"975a1df1-de82-496c-b7cf-7dd0a83bb6de","Type":"ContainerDied","Data":"b6b769ee2b4984aa64b8bbd41e2662acf984ed8acd7111aeda3fed6fec681422"} Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.319958 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5db8797b66-krqln" event={"ID":"975a1df1-de82-496c-b7cf-7dd0a83bb6de","Type":"ContainerDied","Data":"fd262c94e2fedea4bb7803fb7f54e1fab69cfa0f020b0094dc66d102ca526360"} Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.439832 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e12061be-982a-4441-baa0-7d1da89faede" path="/var/lib/kubelet/pods/e12061be-982a-4441-baa0-7d1da89faede/volumes" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.708725 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.855505 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-etc-swift\") pod \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.855646 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-log-httpd\") pod \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.855688 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-combined-ca-bundle\") pod \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.855727 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-config-data\") pod \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.855779 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-run-httpd\") pod \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.855813 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbtkk\" (UniqueName: 
\"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-kube-api-access-fbtkk\") pod \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\" (UID: \"975a1df1-de82-496c-b7cf-7dd0a83bb6de\") " Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.856206 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "975a1df1-de82-496c-b7cf-7dd0a83bb6de" (UID: "975a1df1-de82-496c-b7cf-7dd0a83bb6de"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.856614 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "975a1df1-de82-496c-b7cf-7dd0a83bb6de" (UID: "975a1df1-de82-496c-b7cf-7dd0a83bb6de"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.861246 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "975a1df1-de82-496c-b7cf-7dd0a83bb6de" (UID: "975a1df1-de82-496c-b7cf-7dd0a83bb6de"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.861442 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-kube-api-access-fbtkk" (OuterVolumeSpecName: "kube-api-access-fbtkk") pod "975a1df1-de82-496c-b7cf-7dd0a83bb6de" (UID: "975a1df1-de82-496c-b7cf-7dd0a83bb6de"). InnerVolumeSpecName "kube-api-access-fbtkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.904878 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-config-data" (OuterVolumeSpecName: "config-data") pod "975a1df1-de82-496c-b7cf-7dd0a83bb6de" (UID: "975a1df1-de82-496c-b7cf-7dd0a83bb6de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.914733 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "975a1df1-de82-496c-b7cf-7dd0a83bb6de" (UID: "975a1df1-de82-496c-b7cf-7dd0a83bb6de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.958550 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.958592 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.958604 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.958618 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975a1df1-de82-496c-b7cf-7dd0a83bb6de-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:14 crc 
kubenswrapper[4841]: I0130 06:41:14.958630 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/975a1df1-de82-496c-b7cf-7dd0a83bb6de-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:14 crc kubenswrapper[4841]: I0130 06:41:14.958641 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbtkk\" (UniqueName: \"kubernetes.io/projected/975a1df1-de82-496c-b7cf-7dd0a83bb6de-kube-api-access-fbtkk\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:15 crc kubenswrapper[4841]: I0130 06:41:15.338311 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5db8797b66-krqln" event={"ID":"975a1df1-de82-496c-b7cf-7dd0a83bb6de","Type":"ContainerDied","Data":"155342417fabb4f88c5274deda7312c644986b075de2462cc3cf4ffea17c11f4"} Jan 30 06:41:15 crc kubenswrapper[4841]: I0130 06:41:15.338433 4841 scope.go:117] "RemoveContainer" containerID="b6b769ee2b4984aa64b8bbd41e2662acf984ed8acd7111aeda3fed6fec681422" Jan 30 06:41:15 crc kubenswrapper[4841]: I0130 06:41:15.338543 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5db8797b66-krqln" Jan 30 06:41:15 crc kubenswrapper[4841]: I0130 06:41:15.372179 4841 scope.go:117] "RemoveContainer" containerID="fd262c94e2fedea4bb7803fb7f54e1fab69cfa0f020b0094dc66d102ca526360" Jan 30 06:41:15 crc kubenswrapper[4841]: I0130 06:41:15.396072 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5db8797b66-krqln"] Jan 30 06:41:15 crc kubenswrapper[4841]: I0130 06:41:15.402455 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5db8797b66-krqln"] Jan 30 06:41:16 crc kubenswrapper[4841]: I0130 06:41:16.456768 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" path="/var/lib/kubelet/pods/975a1df1-de82-496c-b7cf-7dd0a83bb6de/volumes" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.030347 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tcrrl"] Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031428 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerName="init" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.031450 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerName="init" Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031479 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-server" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.031493 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-server" Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031521 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" containerName="swift-ring-rebalance" Jan 30 06:41:20 crc kubenswrapper[4841]: 
I0130 06:41:20.031535 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" containerName="swift-ring-rebalance" Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031558 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="registry-server" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.031570 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="registry-server" Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031592 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-httpd" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.031606 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-httpd" Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031631 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerName="dnsmasq-dns" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.031643 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerName="dnsmasq-dns" Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031658 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="extract-content" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.031670 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="extract-content" Jan 30 06:41:20 crc kubenswrapper[4841]: E0130 06:41:20.031697 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="extract-utilities" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.031709 
4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="extract-utilities" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.032014 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e345d041-d2b4-4aa7-aae0-35c5dbba9d0b" containerName="swift-ring-rebalance" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.032048 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-server" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.032067 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12061be-982a-4441-baa0-7d1da89faede" containerName="registry-server" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.032089 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34d8bc6-5a83-4050-b4eb-b66cb248dab6" containerName="dnsmasq-dns" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.032119 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="975a1df1-de82-496c-b7cf-7dd0a83bb6de" containerName="proxy-httpd" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.033028 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.042136 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tcrrl"] Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.122819 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d67b-account-create-update-vkk58"] Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.125299 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.130121 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.162853 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlbg4\" (UniqueName: \"kubernetes.io/projected/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-kube-api-access-wlbg4\") pod \"cinder-db-create-tcrrl\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.162979 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-operator-scripts\") pod \"cinder-db-create-tcrrl\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.167027 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d67b-account-create-update-vkk58"] Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.264449 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlbg4\" (UniqueName: \"kubernetes.io/projected/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-kube-api-access-wlbg4\") pod \"cinder-db-create-tcrrl\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.264501 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e57083-b3fd-4ac6-b8bf-3761a913defc-operator-scripts\") pod \"cinder-d67b-account-create-update-vkk58\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " 
pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.264579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcbfh\" (UniqueName: \"kubernetes.io/projected/f9e57083-b3fd-4ac6-b8bf-3761a913defc-kube-api-access-hcbfh\") pod \"cinder-d67b-account-create-update-vkk58\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.264603 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-operator-scripts\") pod \"cinder-db-create-tcrrl\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.265610 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-operator-scripts\") pod \"cinder-db-create-tcrrl\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.291494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlbg4\" (UniqueName: \"kubernetes.io/projected/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-kube-api-access-wlbg4\") pod \"cinder-db-create-tcrrl\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.368051 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.369076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e57083-b3fd-4ac6-b8bf-3761a913defc-operator-scripts\") pod \"cinder-d67b-account-create-update-vkk58\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.369211 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcbfh\" (UniqueName: \"kubernetes.io/projected/f9e57083-b3fd-4ac6-b8bf-3761a913defc-kube-api-access-hcbfh\") pod \"cinder-d67b-account-create-update-vkk58\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.371095 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e57083-b3fd-4ac6-b8bf-3761a913defc-operator-scripts\") pod \"cinder-d67b-account-create-update-vkk58\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.393157 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcbfh\" (UniqueName: \"kubernetes.io/projected/f9e57083-b3fd-4ac6-b8bf-3761a913defc-kube-api-access-hcbfh\") pod \"cinder-d67b-account-create-update-vkk58\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.448589 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:20 crc kubenswrapper[4841]: W0130 06:41:20.879477 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a4d80f_1e00_4dd4_b5ee_1e300fb25c43.slice/crio-8a89561094699ac330b1c5d590767b675b7c3088a3ba6e0a794cd933ccab5e5b WatchSource:0}: Error finding container 8a89561094699ac330b1c5d590767b675b7c3088a3ba6e0a794cd933ccab5e5b: Status 404 returned error can't find the container with id 8a89561094699ac330b1c5d590767b675b7c3088a3ba6e0a794cd933ccab5e5b Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.879560 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tcrrl"] Jan 30 06:41:20 crc kubenswrapper[4841]: I0130 06:41:20.948726 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d67b-account-create-update-vkk58"] Jan 30 06:41:20 crc kubenswrapper[4841]: W0130 06:41:20.951057 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e57083_b3fd_4ac6_b8bf_3761a913defc.slice/crio-e94ee485f2df478dcf2345da1c9c8016d6ccb33035cc6a2abbf521c62ff594f9 WatchSource:0}: Error finding container e94ee485f2df478dcf2345da1c9c8016d6ccb33035cc6a2abbf521c62ff594f9: Status 404 returned error can't find the container with id e94ee485f2df478dcf2345da1c9c8016d6ccb33035cc6a2abbf521c62ff594f9 Jan 30 06:41:21 crc kubenswrapper[4841]: I0130 06:41:21.426145 4841 generic.go:334] "Generic (PLEG): container finished" podID="f9e57083-b3fd-4ac6-b8bf-3761a913defc" containerID="eda2754cf0f3fbd0436f21c07e9cfbc7d78955b14f44a4941ef6db6def2ff46c" exitCode=0 Jan 30 06:41:21 crc kubenswrapper[4841]: I0130 06:41:21.426282 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d67b-account-create-update-vkk58" 
event={"ID":"f9e57083-b3fd-4ac6-b8bf-3761a913defc","Type":"ContainerDied","Data":"eda2754cf0f3fbd0436f21c07e9cfbc7d78955b14f44a4941ef6db6def2ff46c"} Jan 30 06:41:21 crc kubenswrapper[4841]: I0130 06:41:21.426468 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d67b-account-create-update-vkk58" event={"ID":"f9e57083-b3fd-4ac6-b8bf-3761a913defc","Type":"ContainerStarted","Data":"e94ee485f2df478dcf2345da1c9c8016d6ccb33035cc6a2abbf521c62ff594f9"} Jan 30 06:41:21 crc kubenswrapper[4841]: I0130 06:41:21.431598 4841 generic.go:334] "Generic (PLEG): container finished" podID="e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43" containerID="4ace0a36f73702b5c9588fb2b4ac8ca3801a27ccfbcb1fb94475b8a13c78744a" exitCode=0 Jan 30 06:41:21 crc kubenswrapper[4841]: I0130 06:41:21.431689 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tcrrl" event={"ID":"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43","Type":"ContainerDied","Data":"4ace0a36f73702b5c9588fb2b4ac8ca3801a27ccfbcb1fb94475b8a13c78744a"} Jan 30 06:41:21 crc kubenswrapper[4841]: I0130 06:41:21.431747 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tcrrl" event={"ID":"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43","Type":"ContainerStarted","Data":"8a89561094699ac330b1c5d590767b675b7c3088a3ba6e0a794cd933ccab5e5b"} Jan 30 06:41:22 crc kubenswrapper[4841]: I0130 06:41:22.774543 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:22 crc kubenswrapper[4841]: I0130 06:41:22.899723 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:22 crc kubenswrapper[4841]: I0130 06:41:22.938683 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e57083-b3fd-4ac6-b8bf-3761a913defc-operator-scripts\") pod \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " Jan 30 06:41:22 crc kubenswrapper[4841]: I0130 06:41:22.938747 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcbfh\" (UniqueName: \"kubernetes.io/projected/f9e57083-b3fd-4ac6-b8bf-3761a913defc-kube-api-access-hcbfh\") pod \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\" (UID: \"f9e57083-b3fd-4ac6-b8bf-3761a913defc\") " Jan 30 06:41:22 crc kubenswrapper[4841]: I0130 06:41:22.939109 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e57083-b3fd-4ac6-b8bf-3761a913defc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9e57083-b3fd-4ac6-b8bf-3761a913defc" (UID: "f9e57083-b3fd-4ac6-b8bf-3761a913defc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:22 crc kubenswrapper[4841]: I0130 06:41:22.944965 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e57083-b3fd-4ac6-b8bf-3761a913defc-kube-api-access-hcbfh" (OuterVolumeSpecName: "kube-api-access-hcbfh") pod "f9e57083-b3fd-4ac6-b8bf-3761a913defc" (UID: "f9e57083-b3fd-4ac6-b8bf-3761a913defc"). InnerVolumeSpecName "kube-api-access-hcbfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.039698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-operator-scripts\") pod \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.039868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlbg4\" (UniqueName: \"kubernetes.io/projected/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-kube-api-access-wlbg4\") pod \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\" (UID: \"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43\") " Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.040184 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43" (UID: "e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.040207 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e57083-b3fd-4ac6-b8bf-3761a913defc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.040222 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcbfh\" (UniqueName: \"kubernetes.io/projected/f9e57083-b3fd-4ac6-b8bf-3761a913defc-kube-api-access-hcbfh\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.044651 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-kube-api-access-wlbg4" (OuterVolumeSpecName: "kube-api-access-wlbg4") pod "e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43" (UID: "e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43"). InnerVolumeSpecName "kube-api-access-wlbg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.141946 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlbg4\" (UniqueName: \"kubernetes.io/projected/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-kube-api-access-wlbg4\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.141971 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.450373 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tcrrl" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.450387 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tcrrl" event={"ID":"e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43","Type":"ContainerDied","Data":"8a89561094699ac330b1c5d590767b675b7c3088a3ba6e0a794cd933ccab5e5b"} Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.450505 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a89561094699ac330b1c5d590767b675b7c3088a3ba6e0a794cd933ccab5e5b" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.452119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d67b-account-create-update-vkk58" event={"ID":"f9e57083-b3fd-4ac6-b8bf-3761a913defc","Type":"ContainerDied","Data":"e94ee485f2df478dcf2345da1c9c8016d6ccb33035cc6a2abbf521c62ff594f9"} Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.452163 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e94ee485f2df478dcf2345da1c9c8016d6ccb33035cc6a2abbf521c62ff594f9" Jan 30 06:41:23 crc kubenswrapper[4841]: I0130 06:41:23.452193 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d67b-account-create-update-vkk58" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.365303 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nh96x"] Jan 30 06:41:25 crc kubenswrapper[4841]: E0130 06:41:25.365982 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e57083-b3fd-4ac6-b8bf-3761a913defc" containerName="mariadb-account-create-update" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.365999 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e57083-b3fd-4ac6-b8bf-3761a913defc" containerName="mariadb-account-create-update" Jan 30 06:41:25 crc kubenswrapper[4841]: E0130 06:41:25.366011 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43" containerName="mariadb-database-create" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.366019 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43" containerName="mariadb-database-create" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.366213 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e57083-b3fd-4ac6-b8bf-3761a913defc" containerName="mariadb-account-create-update" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.366248 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43" containerName="mariadb-database-create" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.366912 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.368674 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.368882 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7jjkz" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.369257 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.379308 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nh96x"] Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.485271 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-config-data\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.485430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-scripts\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.485501 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89cp8\" (UniqueName: \"kubernetes.io/projected/06c6cefc-90f2-482c-9436-12542e71f13c-kube-api-access-89cp8\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.485559 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-combined-ca-bundle\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.485659 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-db-sync-config-data\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.485686 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06c6cefc-90f2-482c-9436-12542e71f13c-etc-machine-id\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.587523 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89cp8\" (UniqueName: \"kubernetes.io/projected/06c6cefc-90f2-482c-9436-12542e71f13c-kube-api-access-89cp8\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.587580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-combined-ca-bundle\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.587643 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-db-sync-config-data\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.587669 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06c6cefc-90f2-482c-9436-12542e71f13c-etc-machine-id\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.587740 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-config-data\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.587829 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-scripts\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.588853 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06c6cefc-90f2-482c-9436-12542e71f13c-etc-machine-id\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.595831 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-scripts\") pod \"cinder-db-sync-nh96x\" (UID: 
\"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.596136 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-config-data\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.597749 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-db-sync-config-data\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.610255 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-combined-ca-bundle\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.619042 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89cp8\" (UniqueName: \"kubernetes.io/projected/06c6cefc-90f2-482c-9436-12542e71f13c-kube-api-access-89cp8\") pod \"cinder-db-sync-nh96x\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") " pod="openstack/cinder-db-sync-nh96x" Jan 30 06:41:25 crc kubenswrapper[4841]: I0130 06:41:25.701064 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nh96x"
Jan 30 06:41:26 crc kubenswrapper[4841]: I0130 06:41:26.186011 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nh96x"]
Jan 30 06:41:26 crc kubenswrapper[4841]: I0130 06:41:26.482919 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nh96x" event={"ID":"06c6cefc-90f2-482c-9436-12542e71f13c","Type":"ContainerStarted","Data":"9e590684d0ff7ee1f3157205db443767fbc0d2d987876c8b73fe36d872518e14"}
Jan 30 06:41:27 crc kubenswrapper[4841]: I0130 06:41:27.495733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nh96x" event={"ID":"06c6cefc-90f2-482c-9436-12542e71f13c","Type":"ContainerStarted","Data":"70b1b32b149b58c6eed23d2aa01c089ac5a2727f76ae45632700dd9e2d709f9e"}
Jan 30 06:41:27 crc kubenswrapper[4841]: I0130 06:41:27.529176 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nh96x" podStartSLOduration=2.529157368 podStartE2EDuration="2.529157368s" podCreationTimestamp="2026-01-30 06:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:27.518062502 +0000 UTC m=+5624.511535180" watchObservedRunningTime="2026-01-30 06:41:27.529157368 +0000 UTC m=+5624.522630016"
Jan 30 06:41:29 crc kubenswrapper[4841]: I0130 06:41:29.521768 4841 generic.go:334] "Generic (PLEG): container finished" podID="06c6cefc-90f2-482c-9436-12542e71f13c" containerID="70b1b32b149b58c6eed23d2aa01c089ac5a2727f76ae45632700dd9e2d709f9e" exitCode=0
Jan 30 06:41:29 crc kubenswrapper[4841]: I0130 06:41:29.521862 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nh96x" event={"ID":"06c6cefc-90f2-482c-9436-12542e71f13c","Type":"ContainerDied","Data":"70b1b32b149b58c6eed23d2aa01c089ac5a2727f76ae45632700dd9e2d709f9e"}
Jan 30 06:41:30 crc kubenswrapper[4841]: I0130 06:41:30.931805 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nh96x"
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.096575 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-db-sync-config-data\") pod \"06c6cefc-90f2-482c-9436-12542e71f13c\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") "
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.096641 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89cp8\" (UniqueName: \"kubernetes.io/projected/06c6cefc-90f2-482c-9436-12542e71f13c-kube-api-access-89cp8\") pod \"06c6cefc-90f2-482c-9436-12542e71f13c\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") "
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.096669 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-combined-ca-bundle\") pod \"06c6cefc-90f2-482c-9436-12542e71f13c\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") "
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.096766 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06c6cefc-90f2-482c-9436-12542e71f13c-etc-machine-id\") pod \"06c6cefc-90f2-482c-9436-12542e71f13c\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") "
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.096840 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-scripts\") pod \"06c6cefc-90f2-482c-9436-12542e71f13c\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") "
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.096918 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06c6cefc-90f2-482c-9436-12542e71f13c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "06c6cefc-90f2-482c-9436-12542e71f13c" (UID: "06c6cefc-90f2-482c-9436-12542e71f13c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.097007 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-config-data\") pod \"06c6cefc-90f2-482c-9436-12542e71f13c\" (UID: \"06c6cefc-90f2-482c-9436-12542e71f13c\") "
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.097538 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06c6cefc-90f2-482c-9436-12542e71f13c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.102712 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "06c6cefc-90f2-482c-9436-12542e71f13c" (UID: "06c6cefc-90f2-482c-9436-12542e71f13c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.103917 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c6cefc-90f2-482c-9436-12542e71f13c-kube-api-access-89cp8" (OuterVolumeSpecName: "kube-api-access-89cp8") pod "06c6cefc-90f2-482c-9436-12542e71f13c" (UID: "06c6cefc-90f2-482c-9436-12542e71f13c"). InnerVolumeSpecName "kube-api-access-89cp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.103958 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-scripts" (OuterVolumeSpecName: "scripts") pod "06c6cefc-90f2-482c-9436-12542e71f13c" (UID: "06c6cefc-90f2-482c-9436-12542e71f13c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.140433 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06c6cefc-90f2-482c-9436-12542e71f13c" (UID: "06c6cefc-90f2-482c-9436-12542e71f13c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.164103 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-config-data" (OuterVolumeSpecName: "config-data") pod "06c6cefc-90f2-482c-9436-12542e71f13c" (UID: "06c6cefc-90f2-482c-9436-12542e71f13c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.198857 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.198888 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.198899 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89cp8\" (UniqueName: \"kubernetes.io/projected/06c6cefc-90f2-482c-9436-12542e71f13c-kube-api-access-89cp8\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.198910 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.198918 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06c6cefc-90f2-482c-9436-12542e71f13c-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.558302 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nh96x" event={"ID":"06c6cefc-90f2-482c-9436-12542e71f13c","Type":"ContainerDied","Data":"9e590684d0ff7ee1f3157205db443767fbc0d2d987876c8b73fe36d872518e14"}
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.558627 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e590684d0ff7ee1f3157205db443767fbc0d2d987876c8b73fe36d872518e14"
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.558692 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nh96x"
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.976577 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ddb7bdbd9-frmht"]
Jan 30 06:41:31 crc kubenswrapper[4841]: E0130 06:41:31.977008 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c6cefc-90f2-482c-9436-12542e71f13c" containerName="cinder-db-sync"
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.977026 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c6cefc-90f2-482c-9436-12542e71f13c" containerName="cinder-db-sync"
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.977231 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c6cefc-90f2-482c-9436-12542e71f13c" containerName="cinder-db-sync"
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.978165 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:31 crc kubenswrapper[4841]: I0130 06:41:31.994803 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ddb7bdbd9-frmht"]
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.124155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-config\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.124215 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-kube-api-access-bt9pm\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.124308 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.124326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-dns-svc\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.124379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.137583 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.138851 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.140773 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7jjkz"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.141064 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.141147 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.141278 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.155689 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.225643 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.225691 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-dns-svc\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.225751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.225805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-config\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.225829 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-kube-api-access-bt9pm\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.226594 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.226619 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.226594 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-dns-svc\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.226851 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-config\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.242148 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-kube-api-access-bt9pm\") pod \"dnsmasq-dns-7ddb7bdbd9-frmht\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.314274 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.327099 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-logs\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.327143 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.327272 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.327458 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-scripts\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.327483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data-custom\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.327511 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.327597 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcx7g\" (UniqueName: \"kubernetes.io/projected/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-kube-api-access-vcx7g\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.429541 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.429838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-scripts\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.429854 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data-custom\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.429874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.429910 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcx7g\" (UniqueName: \"kubernetes.io/projected/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-kube-api-access-vcx7g\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.429949 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-logs\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.429969 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.430043 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.430461 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-logs\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.438118 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.443998 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data-custom\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.449275 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.454903 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcx7g\" (UniqueName: \"kubernetes.io/projected/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-kube-api-access-vcx7g\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.455744 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-scripts\") pod \"cinder-api-0\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") " pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.751580 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:41:32 crc kubenswrapper[4841]: I0130 06:41:32.917936 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ddb7bdbd9-frmht"]
Jan 30 06:41:33 crc kubenswrapper[4841]: I0130 06:41:33.216845 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:41:33 crc kubenswrapper[4841]: W0130 06:41:33.231177 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f8411ef_f7ea_4a61_bb6f_58105c9d1bef.slice/crio-21016d057b004dde753bcdfa7ea2fce7159f5328010a9f95ca09214734e3d24b WatchSource:0}: Error finding container 21016d057b004dde753bcdfa7ea2fce7159f5328010a9f95ca09214734e3d24b: Status 404 returned error can't find the container with id 21016d057b004dde753bcdfa7ea2fce7159f5328010a9f95ca09214734e3d24b
Jan 30 06:41:33 crc kubenswrapper[4841]: I0130 06:41:33.592208 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerID="62f3ccdf3fca402e14cbb5381667d579911d5599cd0a2a5f20e009765ef424d8" exitCode=0
Jan 30 06:41:33 crc kubenswrapper[4841]: I0130 06:41:33.592247 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" event={"ID":"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c","Type":"ContainerDied","Data":"62f3ccdf3fca402e14cbb5381667d579911d5599cd0a2a5f20e009765ef424d8"}
Jan 30 06:41:33 crc kubenswrapper[4841]: I0130 06:41:33.592286 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" event={"ID":"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c","Type":"ContainerStarted","Data":"a5ec3e169419ad48a3bcd5daeef550111b20e2e5f7599e63476eab7ca7a7532c"}
Jan 30 06:41:33 crc kubenswrapper[4841]: I0130 06:41:33.597288 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef","Type":"ContainerStarted","Data":"21016d057b004dde753bcdfa7ea2fce7159f5328010a9f95ca09214734e3d24b"}
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.608458 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" event={"ID":"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c","Type":"ContainerStarted","Data":"9c485f799d5c5fbfcd23a5f96ca4c8c301dc863b46a22284fc74b96700d63030"}
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.608902 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht"
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.610811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef","Type":"ContainerStarted","Data":"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763"}
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.610846 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef","Type":"ContainerStarted","Data":"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1"}
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.610922 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.641542 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" podStartSLOduration=3.641526157 podStartE2EDuration="3.641526157s" podCreationTimestamp="2026-01-30 06:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:34.632692021 +0000 UTC m=+5631.626164669" watchObservedRunningTime="2026-01-30 06:41:34.641526157 +0000 UTC m=+5631.634998795"
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.656455 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.656429194 podStartE2EDuration="2.656429194s" podCreationTimestamp="2026-01-30 06:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:34.651453162 +0000 UTC m=+5631.644925840" watchObservedRunningTime="2026-01-30 06:41:34.656429194 +0000 UTC m=+5631.649901872"
Jan 30 06:41:34 crc kubenswrapper[4841]: I0130 06:41:34.922991 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:41:36 crc kubenswrapper[4841]: I0130 06:41:36.624437 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api-log" containerID="cri-o://c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1" gracePeriod=30
Jan 30 06:41:36 crc kubenswrapper[4841]: I0130 06:41:36.624459 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api" containerID="cri-o://6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763" gracePeriod=30
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.198868 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.238529 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcx7g\" (UniqueName: \"kubernetes.io/projected/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-kube-api-access-vcx7g\") pod \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") "
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.238642 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-scripts\") pod \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") "
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.238742 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-combined-ca-bundle\") pod \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") "
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.238792 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data\") pod \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") "
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.238819 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data-custom\") pod \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") "
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.238857 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-etc-machine-id\") pod \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") "
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.238887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-logs\") pod \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\" (UID: \"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef\") "
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.239885 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-logs" (OuterVolumeSpecName: "logs") pod "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" (UID: "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.242754 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" (UID: "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.245672 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" (UID: "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.246710 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-kube-api-access-vcx7g" (OuterVolumeSpecName: "kube-api-access-vcx7g") pod "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" (UID: "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef"). InnerVolumeSpecName "kube-api-access-vcx7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.247983 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-scripts" (OuterVolumeSpecName: "scripts") pod "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" (UID: "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.277588 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" (UID: "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.299997 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data" (OuterVolumeSpecName: "config-data") pod "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" (UID: "5f8411ef-f7ea-4a61-bb6f-58105c9d1bef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.340718 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.340752 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.340765 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.340776 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.340788 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.340799 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcx7g\" (UniqueName: \"kubernetes.io/projected/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-kube-api-access-vcx7g\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.340812 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.635267 4841 generic.go:334] "Generic
(PLEG): container finished" podID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerID="6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763" exitCode=0 Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.635307 4841 generic.go:334] "Generic (PLEG): container finished" podID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerID="c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1" exitCode=143 Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.635318 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef","Type":"ContainerDied","Data":"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763"} Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.635427 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef","Type":"ContainerDied","Data":"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1"} Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.635434 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.635467 4841 scope.go:117] "RemoveContainer" containerID="6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.635451 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f8411ef-f7ea-4a61-bb6f-58105c9d1bef","Type":"ContainerDied","Data":"21016d057b004dde753bcdfa7ea2fce7159f5328010a9f95ca09214734e3d24b"} Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.663619 4841 scope.go:117] "RemoveContainer" containerID="c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.694596 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.705145 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.705859 4841 scope.go:117] "RemoveContainer" containerID="6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763" Jan 30 06:41:37 crc kubenswrapper[4841]: E0130 06:41:37.713167 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763\": container with ID starting with 6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763 not found: ID does not exist" containerID="6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.713230 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763"} err="failed to get container status \"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763\": rpc error: 
code = NotFound desc = could not find container \"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763\": container with ID starting with 6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763 not found: ID does not exist" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.713272 4841 scope.go:117] "RemoveContainer" containerID="c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1" Jan 30 06:41:37 crc kubenswrapper[4841]: E0130 06:41:37.713810 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1\": container with ID starting with c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1 not found: ID does not exist" containerID="c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.713860 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1"} err="failed to get container status \"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1\": rpc error: code = NotFound desc = could not find container \"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1\": container with ID starting with c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1 not found: ID does not exist" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.713893 4841 scope.go:117] "RemoveContainer" containerID="6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.714645 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763"} err="failed to get container status \"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763\": 
rpc error: code = NotFound desc = could not find container \"6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763\": container with ID starting with 6750913aa2223d8f81ed13a86b87049955ba7958a8cb858b5de50d7f4285e763 not found: ID does not exist" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.714713 4841 scope.go:117] "RemoveContainer" containerID="c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.715415 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1"} err="failed to get container status \"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1\": rpc error: code = NotFound desc = could not find container \"c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1\": container with ID starting with c9cd4f4ac995f139681ac7611ae4b39d0cf20c4bc321f62f36f80fddccd08ba1 not found: ID does not exist" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.733725 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:41:37 crc kubenswrapper[4841]: E0130 06:41:37.734508 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.734553 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api" Jan 30 06:41:37 crc kubenswrapper[4841]: E0130 06:41:37.734624 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api-log" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.734642 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api-log" Jan 30 06:41:37 crc kubenswrapper[4841]: 
I0130 06:41:37.735034 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.735088 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" containerName="cinder-api-log" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.736895 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.745093 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.745550 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.745812 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.746645 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.746865 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.747015 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.747207 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7jjkz" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.749647 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data\") pod \"cinder-api-0\" (UID: 
\"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.749713 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.749766 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/511b3720-7eec-46e8-9152-68e80b421928-logs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.749806 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-public-tls-certs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.749887 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.749915 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511b3720-7eec-46e8-9152-68e80b421928-etc-machine-id\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 
06:41:37.749952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-scripts\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.749983 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data-custom\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.750011 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq7qn\" (UniqueName: \"kubernetes.io/projected/511b3720-7eec-46e8-9152-68e80b421928-kube-api-access-sq7qn\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.851680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511b3720-7eec-46e8-9152-68e80b421928-etc-machine-id\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.851752 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-scripts\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.851787 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data-custom\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.851818 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq7qn\" (UniqueName: \"kubernetes.io/projected/511b3720-7eec-46e8-9152-68e80b421928-kube-api-access-sq7qn\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.851847 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.851869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511b3720-7eec-46e8-9152-68e80b421928-etc-machine-id\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.851889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.852313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/511b3720-7eec-46e8-9152-68e80b421928-logs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 
06:41:37.852499 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-public-tls-certs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.852801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.853942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/511b3720-7eec-46e8-9152-68e80b421928-logs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.857927 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-public-tls-certs\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.858010 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-scripts\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.858265 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.860193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.865458 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.866274 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data-custom\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:37 crc kubenswrapper[4841]: I0130 06:41:37.879322 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq7qn\" (UniqueName: \"kubernetes.io/projected/511b3720-7eec-46e8-9152-68e80b421928-kube-api-access-sq7qn\") pod \"cinder-api-0\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") " pod="openstack/cinder-api-0" Jan 30 06:41:38 crc kubenswrapper[4841]: I0130 06:41:38.108528 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 06:41:38 crc kubenswrapper[4841]: I0130 06:41:38.446261 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8411ef-f7ea-4a61-bb6f-58105c9d1bef" path="/var/lib/kubelet/pods/5f8411ef-f7ea-4a61-bb6f-58105c9d1bef/volumes" Jan 30 06:41:38 crc kubenswrapper[4841]: W0130 06:41:38.640980 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod511b3720_7eec_46e8_9152_68e80b421928.slice/crio-b3fec028612627fae7a35b1d22b62d318b2726db66abb17d21c2e52f0304439b WatchSource:0}: Error finding container b3fec028612627fae7a35b1d22b62d318b2726db66abb17d21c2e52f0304439b: Status 404 returned error can't find the container with id b3fec028612627fae7a35b1d22b62d318b2726db66abb17d21c2e52f0304439b Jan 30 06:41:38 crc kubenswrapper[4841]: I0130 06:41:38.642812 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 06:41:39 crc kubenswrapper[4841]: I0130 06:41:39.663771 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"511b3720-7eec-46e8-9152-68e80b421928","Type":"ContainerStarted","Data":"818d39892a6c6c11d2ad98cd64ae0a1b8fe49ec10e3b6450a88ad9471a6e0ea3"} Jan 30 06:41:39 crc kubenswrapper[4841]: I0130 06:41:39.664141 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"511b3720-7eec-46e8-9152-68e80b421928","Type":"ContainerStarted","Data":"b3fec028612627fae7a35b1d22b62d318b2726db66abb17d21c2e52f0304439b"} Jan 30 06:41:40 crc kubenswrapper[4841]: I0130 06:41:40.463846 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:41:40 crc kubenswrapper[4841]: I0130 
06:41:40.463921 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:41:40 crc kubenswrapper[4841]: I0130 06:41:40.676673 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"511b3720-7eec-46e8-9152-68e80b421928","Type":"ContainerStarted","Data":"692c64788051a188cf0bd8815565445d11731b5b5c4c2895fee133ca5319cb4b"} Jan 30 06:41:40 crc kubenswrapper[4841]: I0130 06:41:40.676865 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 06:41:40 crc kubenswrapper[4841]: I0130 06:41:40.716128 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.716102185 podStartE2EDuration="3.716102185s" podCreationTimestamp="2026-01-30 06:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:41:40.702909263 +0000 UTC m=+5637.696381941" watchObservedRunningTime="2026-01-30 06:41:40.716102185 +0000 UTC m=+5637.709574853" Jan 30 06:41:42 crc kubenswrapper[4841]: I0130 06:41:42.316614 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" Jan 30 06:41:42 crc kubenswrapper[4841]: I0130 06:41:42.422799 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f65fd76ff-29kqp"] Jan 30 06:41:42 crc kubenswrapper[4841]: I0130 06:41:42.423086 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" podUID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerName="dnsmasq-dns" 
containerID="cri-o://ac000c6e9be776fa61763225bed18d5d5cb0c95daccf056091f71f064a0d7d38" gracePeriod=10 Jan 30 06:41:42 crc kubenswrapper[4841]: I0130 06:41:42.723420 4841 generic.go:334] "Generic (PLEG): container finished" podID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerID="ac000c6e9be776fa61763225bed18d5d5cb0c95daccf056091f71f064a0d7d38" exitCode=0 Jan 30 06:41:42 crc kubenswrapper[4841]: I0130 06:41:42.723626 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" event={"ID":"98ed3339-b24b-44c5-b2f5-47a5c84a39cd","Type":"ContainerDied","Data":"ac000c6e9be776fa61763225bed18d5d5cb0c95daccf056091f71f064a0d7d38"} Jan 30 06:41:42 crc kubenswrapper[4841]: I0130 06:41:42.950627 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.060624 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-dns-svc\") pod \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.060678 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-config\") pod \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.060732 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-sb\") pod \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.060780 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tgblx\" (UniqueName: \"kubernetes.io/projected/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-kube-api-access-tgblx\") pod \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.060828 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-nb\") pod \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\" (UID: \"98ed3339-b24b-44c5-b2f5-47a5c84a39cd\") " Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.086502 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-kube-api-access-tgblx" (OuterVolumeSpecName: "kube-api-access-tgblx") pod "98ed3339-b24b-44c5-b2f5-47a5c84a39cd" (UID: "98ed3339-b24b-44c5-b2f5-47a5c84a39cd"). InnerVolumeSpecName "kube-api-access-tgblx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.119320 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-config" (OuterVolumeSpecName: "config") pod "98ed3339-b24b-44c5-b2f5-47a5c84a39cd" (UID: "98ed3339-b24b-44c5-b2f5-47a5c84a39cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.128454 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98ed3339-b24b-44c5-b2f5-47a5c84a39cd" (UID: "98ed3339-b24b-44c5-b2f5-47a5c84a39cd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.134819 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98ed3339-b24b-44c5-b2f5-47a5c84a39cd" (UID: "98ed3339-b24b-44c5-b2f5-47a5c84a39cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.137810 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98ed3339-b24b-44c5-b2f5-47a5c84a39cd" (UID: "98ed3339-b24b-44c5-b2f5-47a5c84a39cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.162783 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.162828 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.162905 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.162914 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.162923 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgblx\" (UniqueName: \"kubernetes.io/projected/98ed3339-b24b-44c5-b2f5-47a5c84a39cd-kube-api-access-tgblx\") on node \"crc\" DevicePath \"\""
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.734565 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp" event={"ID":"98ed3339-b24b-44c5-b2f5-47a5c84a39cd","Type":"ContainerDied","Data":"f5110dd8434a3c3cdc102ed4f1bae36a2437ea05b036f9528ef33eec3afa446c"}
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.734651 4841 scope.go:117] "RemoveContainer" containerID="ac000c6e9be776fa61763225bed18d5d5cb0c95daccf056091f71f064a0d7d38"
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.734681 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f65fd76ff-29kqp"
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.763562 4841 scope.go:117] "RemoveContainer" containerID="65ab125f74e01f81d0468ef3b9613680f09d9bcd57c7cdafaa370a8e415c699a"
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.783224 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f65fd76ff-29kqp"]
Jan 30 06:41:43 crc kubenswrapper[4841]: I0130 06:41:43.790480 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f65fd76ff-29kqp"]
Jan 30 06:41:44 crc kubenswrapper[4841]: I0130 06:41:44.444227 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" path="/var/lib/kubelet/pods/98ed3339-b24b-44c5-b2f5-47a5c84a39cd/volumes"
Jan 30 06:41:49 crc kubenswrapper[4841]: I0130 06:41:49.966904 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.129919 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 06:42:07 crc kubenswrapper[4841]: E0130 06:42:07.131155 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerName="init"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.131182 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerName="init"
Jan 30 06:42:07 crc kubenswrapper[4841]: E0130 06:42:07.131213 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerName="dnsmasq-dns"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.131227 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerName="dnsmasq-dns"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.131624 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ed3339-b24b-44c5-b2f5-47a5c84a39cd" containerName="dnsmasq-dns"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.133606 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.138688 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.149764 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.301144 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.301255 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkl98\" (UniqueName: \"kubernetes.io/projected/5a45b425-bf73-4efe-b978-38207c627bff-kube-api-access-vkl98\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.301316 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a45b425-bf73-4efe-b978-38207c627bff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.301348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.301474 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.301541 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.403544 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.404013 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.404159 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkl98\" (UniqueName: \"kubernetes.io/projected/5a45b425-bf73-4efe-b978-38207c627bff-kube-api-access-vkl98\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.404300 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a45b425-bf73-4efe-b978-38207c627bff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.404446 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.404494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a45b425-bf73-4efe-b978-38207c627bff-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.404587 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.413223 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.414608 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.418201 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.423732 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.428168 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkl98\" (UniqueName: \"kubernetes.io/projected/5a45b425-bf73-4efe-b978-38207c627bff-kube-api-access-vkl98\") pod \"cinder-scheduler-0\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.455062 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.938628 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 06:42:07 crc kubenswrapper[4841]: I0130 06:42:07.999485 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a45b425-bf73-4efe-b978-38207c627bff","Type":"ContainerStarted","Data":"2a8df443c5deefce0cb32c3a830e7ba8a4ba6107fe1395dc894d0cef40a0855f"}
Jan 30 06:42:08 crc kubenswrapper[4841]: I0130 06:42:08.348505 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:42:08 crc kubenswrapper[4841]: I0130 06:42:08.348757 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api-log" containerID="cri-o://818d39892a6c6c11d2ad98cd64ae0a1b8fe49ec10e3b6450a88ad9471a6e0ea3" gracePeriod=30
Jan 30 06:42:08 crc kubenswrapper[4841]: I0130 06:42:08.349108 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api" containerID="cri-o://692c64788051a188cf0bd8815565445d11731b5b5c4c2895fee133ca5319cb4b" gracePeriod=30
Jan 30 06:42:09 crc kubenswrapper[4841]: I0130 06:42:09.009720 4841 generic.go:334] "Generic (PLEG): container finished" podID="511b3720-7eec-46e8-9152-68e80b421928" containerID="818d39892a6c6c11d2ad98cd64ae0a1b8fe49ec10e3b6450a88ad9471a6e0ea3" exitCode=143
Jan 30 06:42:09 crc kubenswrapper[4841]: I0130 06:42:09.009804 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"511b3720-7eec-46e8-9152-68e80b421928","Type":"ContainerDied","Data":"818d39892a6c6c11d2ad98cd64ae0a1b8fe49ec10e3b6450a88ad9471a6e0ea3"}
Jan 30 06:42:09 crc kubenswrapper[4841]: I0130 06:42:09.011333 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a45b425-bf73-4efe-b978-38207c627bff","Type":"ContainerStarted","Data":"f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a"}
Jan 30 06:42:10 crc kubenswrapper[4841]: I0130 06:42:10.043942 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a45b425-bf73-4efe-b978-38207c627bff","Type":"ContainerStarted","Data":"99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9"}
Jan 30 06:42:10 crc kubenswrapper[4841]: I0130 06:42:10.076235 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.076214481 podStartE2EDuration="3.076214481s" podCreationTimestamp="2026-01-30 06:42:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:10.074797193 +0000 UTC m=+5667.068269871" watchObservedRunningTime="2026-01-30 06:42:10.076214481 +0000 UTC m=+5667.069687139"
Jan 30 06:42:10 crc kubenswrapper[4841]: I0130 06:42:10.463472 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:42:10 crc kubenswrapper[4841]: I0130 06:42:10.464076 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.065228 4841 generic.go:334] "Generic (PLEG): container finished" podID="511b3720-7eec-46e8-9152-68e80b421928" containerID="692c64788051a188cf0bd8815565445d11731b5b5c4c2895fee133ca5319cb4b" exitCode=0
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.065277 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"511b3720-7eec-46e8-9152-68e80b421928","Type":"ContainerDied","Data":"692c64788051a188cf0bd8815565445d11731b5b5c4c2895fee133ca5319cb4b"}
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.349299 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.473008 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.516914 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/511b3720-7eec-46e8-9152-68e80b421928-logs\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.516986 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.517054 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-scripts\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.517093 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq7qn\" (UniqueName: \"kubernetes.io/projected/511b3720-7eec-46e8-9152-68e80b421928-kube-api-access-sq7qn\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.517155 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data-custom\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.517227 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-internal-tls-certs\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.517244 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511b3720-7eec-46e8-9152-68e80b421928-etc-machine-id\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.517266 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-public-tls-certs\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.517302 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-combined-ca-bundle\") pod \"511b3720-7eec-46e8-9152-68e80b421928\" (UID: \"511b3720-7eec-46e8-9152-68e80b421928\") "
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.536200 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/511b3720-7eec-46e8-9152-68e80b421928-kube-api-access-sq7qn" (OuterVolumeSpecName: "kube-api-access-sq7qn") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "kube-api-access-sq7qn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.536483 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/511b3720-7eec-46e8-9152-68e80b421928-logs" (OuterVolumeSpecName: "logs") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.536793 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/511b3720-7eec-46e8-9152-68e80b421928-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.550597 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-scripts" (OuterVolumeSpecName: "scripts") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.561678 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.612550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.619997 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.620025 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq7qn\" (UniqueName: \"kubernetes.io/projected/511b3720-7eec-46e8-9152-68e80b421928-kube-api-access-sq7qn\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.620035 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.620043 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511b3720-7eec-46e8-9152-68e80b421928-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.620051 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.620060 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/511b3720-7eec-46e8-9152-68e80b421928-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.625390 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.628473 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.638689 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data" (OuterVolumeSpecName: "config-data") pod "511b3720-7eec-46e8-9152-68e80b421928" (UID: "511b3720-7eec-46e8-9152-68e80b421928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.723520 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.723557 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:12 crc kubenswrapper[4841]: I0130 06:42:12.723570 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511b3720-7eec-46e8-9152-68e80b421928-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.079304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"511b3720-7eec-46e8-9152-68e80b421928","Type":"ContainerDied","Data":"b3fec028612627fae7a35b1d22b62d318b2726db66abb17d21c2e52f0304439b"}
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.079362 4841 scope.go:117] "RemoveContainer" containerID="692c64788051a188cf0bd8815565445d11731b5b5c4c2895fee133ca5319cb4b"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.079528 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.116877 4841 scope.go:117] "RemoveContainer" containerID="818d39892a6c6c11d2ad98cd64ae0a1b8fe49ec10e3b6450a88ad9471a6e0ea3"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.127489 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.141329 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.153705 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:42:13 crc kubenswrapper[4841]: E0130 06:42:13.157980 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api-log"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.158023 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api-log"
Jan 30 06:42:13 crc kubenswrapper[4841]: E0130 06:42:13.158040 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.158048 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.158263 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api-log"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.158298 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="511b3720-7eec-46e8-9152-68e80b421928" containerName="cinder-api"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.159468 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.162580 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.162767 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.162897 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.171027 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337199 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbwqd\" (UniqueName: \"kubernetes.io/projected/881d50e8-f2e2-4d39-b27d-b893a75b9470-kube-api-access-sbwqd\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337262 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/881d50e8-f2e2-4d39-b27d-b893a75b9470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337288 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-config-data\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-config-data-custom\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337354 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-scripts\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337526 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/881d50e8-f2e2-4d39-b27d-b893a75b9470-logs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337729 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.337786 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-public-tls-certs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439638 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbwqd\" (UniqueName: \"kubernetes.io/projected/881d50e8-f2e2-4d39-b27d-b893a75b9470-kube-api-access-sbwqd\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439695 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/881d50e8-f2e2-4d39-b27d-b893a75b9470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-config-data\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439736 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-config-data-custom\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439786 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-scripts\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439830 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/881d50e8-f2e2-4d39-b27d-b893a75b9470-logs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.439878 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-public-tls-certs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.440836 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/881d50e8-f2e2-4d39-b27d-b893a75b9470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.441145 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/881d50e8-f2e2-4d39-b27d-b893a75b9470-logs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.444724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.444882 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-scripts\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.446505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-public-tls-certs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.447466 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.448596 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-config-data\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.450631 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/881d50e8-f2e2-4d39-b27d-b893a75b9470-config-data-custom\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.471355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbwqd\" (UniqueName: \"kubernetes.io/projected/881d50e8-f2e2-4d39-b27d-b893a75b9470-kube-api-access-sbwqd\") pod \"cinder-api-0\" (UID: \"881d50e8-f2e2-4d39-b27d-b893a75b9470\") " pod="openstack/cinder-api-0"
Jan 30 06:42:13 crc kubenswrapper[4841]: I0130 06:42:13.494928 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 06:42:14 crc kubenswrapper[4841]: I0130 06:42:14.028953 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 06:42:14 crc kubenswrapper[4841]: I0130 06:42:14.109337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"881d50e8-f2e2-4d39-b27d-b893a75b9470","Type":"ContainerStarted","Data":"7d22661cd775d494c9155238edd85fe2bde62f7e62a69c25f52e688f21983d38"}
Jan 30 06:42:14 crc kubenswrapper[4841]: I0130 06:42:14.453492 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="511b3720-7eec-46e8-9152-68e80b421928" path="/var/lib/kubelet/pods/511b3720-7eec-46e8-9152-68e80b421928/volumes"
Jan 30 06:42:15 crc kubenswrapper[4841]: I0130 06:42:15.128352 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"881d50e8-f2e2-4d39-b27d-b893a75b9470","Type":"ContainerStarted","Data":"57db06003bc81c66b98cc7950f653aaf4f44591f1210f66f47049eb314a76226"}
Jan 30 06:42:16 crc kubenswrapper[4841]: I0130 06:42:16.140572 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"881d50e8-f2e2-4d39-b27d-b893a75b9470","Type":"ContainerStarted","Data":"a4edfae602984b08b543b71b667e965114fded7e5a86a5669790b1d743134976"}
Jan 30 06:42:16 crc kubenswrapper[4841]: I0130 06:42:16.140825 4841 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 06:42:16 crc kubenswrapper[4841]: I0130 06:42:16.184216 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.184184209 podStartE2EDuration="3.184184209s" podCreationTimestamp="2026-01-30 06:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:16.165789629 +0000 UTC m=+5673.159262307" watchObservedRunningTime="2026-01-30 06:42:16.184184209 +0000 UTC m=+5673.177656887" Jan 30 06:42:17 crc kubenswrapper[4841]: I0130 06:42:17.626638 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 06:42:17 crc kubenswrapper[4841]: I0130 06:42:17.695645 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:42:18 crc kubenswrapper[4841]: I0130 06:42:18.180198 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a45b425-bf73-4efe-b978-38207c627bff" containerName="cinder-scheduler" containerID="cri-o://f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a" gracePeriod=30 Jan 30 06:42:18 crc kubenswrapper[4841]: I0130 06:42:18.180604 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a45b425-bf73-4efe-b978-38207c627bff" containerName="probe" containerID="cri-o://99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9" gracePeriod=30 Jan 30 06:42:19 crc kubenswrapper[4841]: I0130 06:42:19.193610 4841 generic.go:334] "Generic (PLEG): container finished" podID="5a45b425-bf73-4efe-b978-38207c627bff" containerID="99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9" exitCode=0 Jan 30 06:42:19 crc kubenswrapper[4841]: I0130 06:42:19.193847 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a45b425-bf73-4efe-b978-38207c627bff","Type":"ContainerDied","Data":"99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9"} Jan 30 06:42:19 crc kubenswrapper[4841]: I0130 06:42:19.943355 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.071323 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data-custom\") pod \"5a45b425-bf73-4efe-b978-38207c627bff\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.071436 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data\") pod \"5a45b425-bf73-4efe-b978-38207c627bff\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.071566 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-combined-ca-bundle\") pod \"5a45b425-bf73-4efe-b978-38207c627bff\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.071597 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a45b425-bf73-4efe-b978-38207c627bff-etc-machine-id\") pod \"5a45b425-bf73-4efe-b978-38207c627bff\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.071633 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkl98\" (UniqueName: 
\"kubernetes.io/projected/5a45b425-bf73-4efe-b978-38207c627bff-kube-api-access-vkl98\") pod \"5a45b425-bf73-4efe-b978-38207c627bff\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.071919 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-scripts\") pod \"5a45b425-bf73-4efe-b978-38207c627bff\" (UID: \"5a45b425-bf73-4efe-b978-38207c627bff\") " Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.072312 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a45b425-bf73-4efe-b978-38207c627bff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5a45b425-bf73-4efe-b978-38207c627bff" (UID: "5a45b425-bf73-4efe-b978-38207c627bff"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.073013 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a45b425-bf73-4efe-b978-38207c627bff-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.079299 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a45b425-bf73-4efe-b978-38207c627bff-kube-api-access-vkl98" (OuterVolumeSpecName: "kube-api-access-vkl98") pod "5a45b425-bf73-4efe-b978-38207c627bff" (UID: "5a45b425-bf73-4efe-b978-38207c627bff"). InnerVolumeSpecName "kube-api-access-vkl98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.080631 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-scripts" (OuterVolumeSpecName: "scripts") pod "5a45b425-bf73-4efe-b978-38207c627bff" (UID: "5a45b425-bf73-4efe-b978-38207c627bff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.080718 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a45b425-bf73-4efe-b978-38207c627bff" (UID: "5a45b425-bf73-4efe-b978-38207c627bff"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.174793 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.174829 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.174842 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkl98\" (UniqueName: \"kubernetes.io/projected/5a45b425-bf73-4efe-b978-38207c627bff-kube-api-access-vkl98\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.176938 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"5a45b425-bf73-4efe-b978-38207c627bff" (UID: "5a45b425-bf73-4efe-b978-38207c627bff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.206981 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data" (OuterVolumeSpecName: "config-data") pod "5a45b425-bf73-4efe-b978-38207c627bff" (UID: "5a45b425-bf73-4efe-b978-38207c627bff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.209051 4841 generic.go:334] "Generic (PLEG): container finished" podID="5a45b425-bf73-4efe-b978-38207c627bff" containerID="f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a" exitCode=0 Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.209109 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a45b425-bf73-4efe-b978-38207c627bff","Type":"ContainerDied","Data":"f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a"} Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.209146 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a45b425-bf73-4efe-b978-38207c627bff","Type":"ContainerDied","Data":"2a8df443c5deefce0cb32c3a830e7ba8a4ba6107fe1395dc894d0cef40a0855f"} Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.209174 4841 scope.go:117] "RemoveContainer" containerID="99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.209377 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.278106 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.278144 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a45b425-bf73-4efe-b978-38207c627bff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.307851 4841 scope.go:117] "RemoveContainer" containerID="f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.312285 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.323079 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.336016 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:42:20 crc kubenswrapper[4841]: E0130 06:42:20.336379 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a45b425-bf73-4efe-b978-38207c627bff" containerName="cinder-scheduler" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.336416 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a45b425-bf73-4efe-b978-38207c627bff" containerName="cinder-scheduler" Jan 30 06:42:20 crc kubenswrapper[4841]: E0130 06:42:20.336441 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a45b425-bf73-4efe-b978-38207c627bff" containerName="probe" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.336450 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a45b425-bf73-4efe-b978-38207c627bff" 
containerName="probe" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.336653 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a45b425-bf73-4efe-b978-38207c627bff" containerName="cinder-scheduler" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.336673 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a45b425-bf73-4efe-b978-38207c627bff" containerName="probe" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.338992 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.345203 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.358697 4841 scope.go:117] "RemoveContainer" containerID="99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9" Jan 30 06:42:20 crc kubenswrapper[4841]: E0130 06:42:20.359306 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9\": container with ID starting with 99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9 not found: ID does not exist" containerID="99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.359469 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9"} err="failed to get container status \"99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9\": rpc error: code = NotFound desc = could not find container \"99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9\": container with ID starting with 99b1d439cfbc18546f9e2aef592c4484dfe569860e077cd09afcb0aeee89dfb9 not found: ID does not 
exist" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.359617 4841 scope.go:117] "RemoveContainer" containerID="f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a" Jan 30 06:42:20 crc kubenswrapper[4841]: E0130 06:42:20.360121 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a\": container with ID starting with f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a not found: ID does not exist" containerID="f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.360174 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a"} err="failed to get container status \"f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a\": rpc error: code = NotFound desc = could not find container \"f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a\": container with ID starting with f26607372a025c72f1984ee3856aea944f2d2fe9c89f66f1157668dd0ec5fe5a not found: ID does not exist" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.371507 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.381168 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.381246 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-config-data\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.381277 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.381350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-scripts\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.381643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5cd\" (UniqueName: \"kubernetes.io/projected/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-kube-api-access-vn5cd\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.381723 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.443454 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a45b425-bf73-4efe-b978-38207c627bff" path="/var/lib/kubelet/pods/5a45b425-bf73-4efe-b978-38207c627bff/volumes" Jan 30 
06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.483028 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-scripts\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.483501 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5cd\" (UniqueName: \"kubernetes.io/projected/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-kube-api-access-vn5cd\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.483585 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.483721 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.483807 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-config-data\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.483952 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.484273 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.487295 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-scripts\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.487323 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.487557 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-config-data\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.491079 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 
30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.503148 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5cd\" (UniqueName: \"kubernetes.io/projected/9e49f703-0cd0-4d76-aa9a-7694ac86e74b-kube-api-access-vn5cd\") pod \"cinder-scheduler-0\" (UID: \"9e49f703-0cd0-4d76-aa9a-7694ac86e74b\") " pod="openstack/cinder-scheduler-0" Jan 30 06:42:20 crc kubenswrapper[4841]: I0130 06:42:20.660134 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 06:42:21 crc kubenswrapper[4841]: I0130 06:42:21.195863 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 06:42:21 crc kubenswrapper[4841]: I0130 06:42:21.223007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9e49f703-0cd0-4d76-aa9a-7694ac86e74b","Type":"ContainerStarted","Data":"b191b1a8cee444cfdf5f9c29b30c5eb6188044f8aab58720918ae8c54f7bd439"} Jan 30 06:42:22 crc kubenswrapper[4841]: I0130 06:42:22.242266 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9e49f703-0cd0-4d76-aa9a-7694ac86e74b","Type":"ContainerStarted","Data":"d4d59be4244de30a9fe8db225379d78bd6a25471967a7f02aee485dac204ff21"} Jan 30 06:42:23 crc kubenswrapper[4841]: I0130 06:42:23.257472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9e49f703-0cd0-4d76-aa9a-7694ac86e74b","Type":"ContainerStarted","Data":"2fd315a7ad1ce735f383b8ef550c5e97bd10671691cc59184577fc74b4e09d57"} Jan 30 06:42:23 crc kubenswrapper[4841]: I0130 06:42:23.303867 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.303842103 podStartE2EDuration="3.303842103s" podCreationTimestamp="2026-01-30 06:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-30 06:42:23.285570615 +0000 UTC m=+5680.279043283" watchObservedRunningTime="2026-01-30 06:42:23.303842103 +0000 UTC m=+5680.297314771" Jan 30 06:42:25 crc kubenswrapper[4841]: I0130 06:42:25.255939 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 06:42:25 crc kubenswrapper[4841]: I0130 06:42:25.660546 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 06:42:30 crc kubenswrapper[4841]: I0130 06:42:30.933643 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.605034 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-knpsp"] Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.606642 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-knpsp" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.615534 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-knpsp"] Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.687932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-operator-scripts\") pod \"glance-db-create-knpsp\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " pod="openstack/glance-db-create-knpsp" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.688100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrg5f\" (UniqueName: \"kubernetes.io/projected/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-kube-api-access-zrg5f\") pod \"glance-db-create-knpsp\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " pod="openstack/glance-db-create-knpsp" Jan 30 
06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.714756 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9431-account-create-update-fv8z4"] Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.716175 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.718432 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.721114 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9431-account-create-update-fv8z4"] Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.790360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-operator-scripts\") pod \"glance-9431-account-create-update-fv8z4\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.790436 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrg5f\" (UniqueName: \"kubernetes.io/projected/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-kube-api-access-zrg5f\") pod \"glance-db-create-knpsp\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " pod="openstack/glance-db-create-knpsp" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.790505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-operator-scripts\") pod \"glance-db-create-knpsp\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " pod="openstack/glance-db-create-knpsp" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.790538 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52xkj\" (UniqueName: \"kubernetes.io/projected/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-kube-api-access-52xkj\") pod \"glance-9431-account-create-update-fv8z4\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.791649 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-operator-scripts\") pod \"glance-db-create-knpsp\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " pod="openstack/glance-db-create-knpsp" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.814791 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrg5f\" (UniqueName: \"kubernetes.io/projected/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-kube-api-access-zrg5f\") pod \"glance-db-create-knpsp\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " pod="openstack/glance-db-create-knpsp" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.893703 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-operator-scripts\") pod \"glance-9431-account-create-update-fv8z4\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.893790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52xkj\" (UniqueName: \"kubernetes.io/projected/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-kube-api-access-52xkj\") pod \"glance-9431-account-create-update-fv8z4\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 
06:42:33.894503 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-operator-scripts\") pod \"glance-9431-account-create-update-fv8z4\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.910580 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52xkj\" (UniqueName: \"kubernetes.io/projected/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-kube-api-access-52xkj\") pod \"glance-9431-account-create-update-fv8z4\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:33 crc kubenswrapper[4841]: I0130 06:42:33.997327 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-knpsp" Jan 30 06:42:34 crc kubenswrapper[4841]: I0130 06:42:34.035822 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:35 crc kubenswrapper[4841]: I0130 06:42:35.096591 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-knpsp"] Jan 30 06:42:35 crc kubenswrapper[4841]: W0130 06:42:35.218170 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82e5ef1_3deb_4d3b_8ca7_fbd50c8d3e80.slice/crio-333d249dd8a85de0843ade95dbaf34816a11b2179db7949b8b7c393d0d514e7d WatchSource:0}: Error finding container 333d249dd8a85de0843ade95dbaf34816a11b2179db7949b8b7c393d0d514e7d: Status 404 returned error can't find the container with id 333d249dd8a85de0843ade95dbaf34816a11b2179db7949b8b7c393d0d514e7d Jan 30 06:42:35 crc kubenswrapper[4841]: I0130 06:42:35.221163 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9431-account-create-update-fv8z4"] Jan 30 06:42:35 crc kubenswrapper[4841]: I0130 06:42:35.381038 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9431-account-create-update-fv8z4" event={"ID":"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80","Type":"ContainerStarted","Data":"333d249dd8a85de0843ade95dbaf34816a11b2179db7949b8b7c393d0d514e7d"} Jan 30 06:42:35 crc kubenswrapper[4841]: I0130 06:42:35.382260 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-knpsp" event={"ID":"0c71e210-3b42-46f6-ab07-55bcbd35e2d1","Type":"ContainerStarted","Data":"0ec0aba676eba2cf38437b2a2e2bf6bc45825621a9f2e39dac03a4a0e3c9aedb"} Jan 30 06:42:36 crc kubenswrapper[4841]: I0130 06:42:36.396128 4841 generic.go:334] "Generic (PLEG): container finished" podID="b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80" containerID="fa2ca2b693605bd47db25d5b8fbb79d075832829fe0ba13ef71ba32be9c736a4" exitCode=0 Jan 30 06:42:36 crc kubenswrapper[4841]: I0130 06:42:36.396200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-9431-account-create-update-fv8z4" event={"ID":"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80","Type":"ContainerDied","Data":"fa2ca2b693605bd47db25d5b8fbb79d075832829fe0ba13ef71ba32be9c736a4"} Jan 30 06:42:36 crc kubenswrapper[4841]: I0130 06:42:36.399320 4841 generic.go:334] "Generic (PLEG): container finished" podID="0c71e210-3b42-46f6-ab07-55bcbd35e2d1" containerID="eb94a0c2c11530f01bce52ca38fa5f747ab7a54fb94a92cba3fdf274f6c21662" exitCode=0 Jan 30 06:42:36 crc kubenswrapper[4841]: I0130 06:42:36.399351 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-knpsp" event={"ID":"0c71e210-3b42-46f6-ab07-55bcbd35e2d1","Type":"ContainerDied","Data":"eb94a0c2c11530f01bce52ca38fa5f747ab7a54fb94a92cba3fdf274f6c21662"} Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.919634 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-knpsp" Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.924776 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.989353 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrg5f\" (UniqueName: \"kubernetes.io/projected/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-kube-api-access-zrg5f\") pod \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.989685 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-operator-scripts\") pod \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.989716 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52xkj\" (UniqueName: \"kubernetes.io/projected/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-kube-api-access-52xkj\") pod \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\" (UID: \"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80\") " Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.989784 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-operator-scripts\") pod \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\" (UID: \"0c71e210-3b42-46f6-ab07-55bcbd35e2d1\") " Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.990338 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c71e210-3b42-46f6-ab07-55bcbd35e2d1" (UID: "0c71e210-3b42-46f6-ab07-55bcbd35e2d1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.990676 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80" (UID: "b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.990766 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:37 crc kubenswrapper[4841]: I0130 06:42:37.998124 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-kube-api-access-52xkj" (OuterVolumeSpecName: "kube-api-access-52xkj") pod "b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80" (UID: "b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80"). InnerVolumeSpecName "kube-api-access-52xkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.000858 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-kube-api-access-zrg5f" (OuterVolumeSpecName: "kube-api-access-zrg5f") pod "0c71e210-3b42-46f6-ab07-55bcbd35e2d1" (UID: "0c71e210-3b42-46f6-ab07-55bcbd35e2d1"). InnerVolumeSpecName "kube-api-access-zrg5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.092704 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.092795 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52xkj\" (UniqueName: \"kubernetes.io/projected/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80-kube-api-access-52xkj\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.092821 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrg5f\" (UniqueName: \"kubernetes.io/projected/0c71e210-3b42-46f6-ab07-55bcbd35e2d1-kube-api-access-zrg5f\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.427249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9431-account-create-update-fv8z4" event={"ID":"b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80","Type":"ContainerDied","Data":"333d249dd8a85de0843ade95dbaf34816a11b2179db7949b8b7c393d0d514e7d"} Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.427316 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="333d249dd8a85de0843ade95dbaf34816a11b2179db7949b8b7c393d0d514e7d" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.427276 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9431-account-create-update-fv8z4" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.429462 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-knpsp" event={"ID":"0c71e210-3b42-46f6-ab07-55bcbd35e2d1","Type":"ContainerDied","Data":"0ec0aba676eba2cf38437b2a2e2bf6bc45825621a9f2e39dac03a4a0e3c9aedb"} Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.429516 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec0aba676eba2cf38437b2a2e2bf6bc45825621a9f2e39dac03a4a0e3c9aedb" Jan 30 06:42:38 crc kubenswrapper[4841]: I0130 06:42:38.429565 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-knpsp" Jan 30 06:42:40 crc kubenswrapper[4841]: I0130 06:42:40.463874 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:42:40 crc kubenswrapper[4841]: I0130 06:42:40.464452 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:42:40 crc kubenswrapper[4841]: I0130 06:42:40.464493 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:42:40 crc kubenswrapper[4841]: I0130 06:42:40.465243 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:42:40 crc kubenswrapper[4841]: I0130 06:42:40.465302 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" gracePeriod=600 Jan 30 06:42:40 crc kubenswrapper[4841]: E0130 06:42:40.594988 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:42:41 crc kubenswrapper[4841]: I0130 06:42:41.467956 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" exitCode=0 Jan 30 06:42:41 crc kubenswrapper[4841]: I0130 06:42:41.468058 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"} Jan 30 06:42:41 crc kubenswrapper[4841]: I0130 06:42:41.468445 4841 scope.go:117] "RemoveContainer" containerID="d8439101c0c816d59529151ad7b9f22d723c26d50210e194b3b0d5dd43185d44" Jan 30 06:42:41 crc kubenswrapper[4841]: I0130 06:42:41.469611 4841 
scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:42:41 crc kubenswrapper[4841]: E0130 06:42:41.470465 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.948728 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8pzcg"] Jan 30 06:42:43 crc kubenswrapper[4841]: E0130 06:42:43.949767 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80" containerName="mariadb-account-create-update" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.949803 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80" containerName="mariadb-account-create-update" Jan 30 06:42:43 crc kubenswrapper[4841]: E0130 06:42:43.949877 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c71e210-3b42-46f6-ab07-55bcbd35e2d1" containerName="mariadb-database-create" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.949894 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c71e210-3b42-46f6-ab07-55bcbd35e2d1" containerName="mariadb-database-create" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.950357 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c71e210-3b42-46f6-ab07-55bcbd35e2d1" containerName="mariadb-database-create" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.950468 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80" 
containerName="mariadb-account-create-update" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.951695 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.954832 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s4m68" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.954861 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 30 06:42:43 crc kubenswrapper[4841]: I0130 06:42:43.962715 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8pzcg"] Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.015271 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-config-data\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.015668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66dxp\" (UniqueName: \"kubernetes.io/projected/21d66874-b86c-45c1-80b3-23f6f4741c37-kube-api-access-66dxp\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.015940 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-combined-ca-bundle\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.016116 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-db-sync-config-data\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.118168 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-db-sync-config-data\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.118229 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-config-data\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.118315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66dxp\" (UniqueName: \"kubernetes.io/projected/21d66874-b86c-45c1-80b3-23f6f4741c37-kube-api-access-66dxp\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.118439 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-combined-ca-bundle\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.125558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-combined-ca-bundle\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.129496 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-config-data\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.133556 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-db-sync-config-data\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.158287 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66dxp\" (UniqueName: \"kubernetes.io/projected/21d66874-b86c-45c1-80b3-23f6f4741c37-kube-api-access-66dxp\") pod \"glance-db-sync-8pzcg\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:44 crc kubenswrapper[4841]: I0130 06:42:44.282926 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:45 crc kubenswrapper[4841]: I0130 06:42:45.128005 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8pzcg"] Jan 30 06:42:45 crc kubenswrapper[4841]: W0130 06:42:45.142393 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21d66874_b86c_45c1_80b3_23f6f4741c37.slice/crio-a40b53b5850c471f1afda84817bac8af122147e20970258774ef7e1ee2ad082f WatchSource:0}: Error finding container a40b53b5850c471f1afda84817bac8af122147e20970258774ef7e1ee2ad082f: Status 404 returned error can't find the container with id a40b53b5850c471f1afda84817bac8af122147e20970258774ef7e1ee2ad082f Jan 30 06:42:45 crc kubenswrapper[4841]: I0130 06:42:45.516797 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8pzcg" event={"ID":"21d66874-b86c-45c1-80b3-23f6f4741c37","Type":"ContainerStarted","Data":"a40b53b5850c471f1afda84817bac8af122147e20970258774ef7e1ee2ad082f"} Jan 30 06:42:46 crc kubenswrapper[4841]: I0130 06:42:46.531138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8pzcg" event={"ID":"21d66874-b86c-45c1-80b3-23f6f4741c37","Type":"ContainerStarted","Data":"1c64e1f4c727edeb66150143ae8a5d7dbb6d77d5dc8cdbfd3be87b1c00dc58a4"} Jan 30 06:42:46 crc kubenswrapper[4841]: I0130 06:42:46.567064 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8pzcg" podStartSLOduration=3.567041144 podStartE2EDuration="3.567041144s" podCreationTimestamp="2026-01-30 06:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:46.548613902 +0000 UTC m=+5703.542086580" watchObservedRunningTime="2026-01-30 06:42:46.567041144 +0000 UTC m=+5703.560513792" Jan 30 06:42:49 crc kubenswrapper[4841]: I0130 06:42:49.569032 
4841 generic.go:334] "Generic (PLEG): container finished" podID="21d66874-b86c-45c1-80b3-23f6f4741c37" containerID="1c64e1f4c727edeb66150143ae8a5d7dbb6d77d5dc8cdbfd3be87b1c00dc58a4" exitCode=0 Jan 30 06:42:49 crc kubenswrapper[4841]: I0130 06:42:49.569114 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8pzcg" event={"ID":"21d66874-b86c-45c1-80b3-23f6f4741c37","Type":"ContainerDied","Data":"1c64e1f4c727edeb66150143ae8a5d7dbb6d77d5dc8cdbfd3be87b1c00dc58a4"} Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.121552 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.187126 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66dxp\" (UniqueName: \"kubernetes.io/projected/21d66874-b86c-45c1-80b3-23f6f4741c37-kube-api-access-66dxp\") pod \"21d66874-b86c-45c1-80b3-23f6f4741c37\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.187293 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-config-data\") pod \"21d66874-b86c-45c1-80b3-23f6f4741c37\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.187315 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-db-sync-config-data\") pod \"21d66874-b86c-45c1-80b3-23f6f4741c37\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.187341 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-combined-ca-bundle\") pod \"21d66874-b86c-45c1-80b3-23f6f4741c37\" (UID: \"21d66874-b86c-45c1-80b3-23f6f4741c37\") " Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.194244 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "21d66874-b86c-45c1-80b3-23f6f4741c37" (UID: "21d66874-b86c-45c1-80b3-23f6f4741c37"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.206669 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d66874-b86c-45c1-80b3-23f6f4741c37-kube-api-access-66dxp" (OuterVolumeSpecName: "kube-api-access-66dxp") pod "21d66874-b86c-45c1-80b3-23f6f4741c37" (UID: "21d66874-b86c-45c1-80b3-23f6f4741c37"). InnerVolumeSpecName "kube-api-access-66dxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.229713 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21d66874-b86c-45c1-80b3-23f6f4741c37" (UID: "21d66874-b86c-45c1-80b3-23f6f4741c37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.247791 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-config-data" (OuterVolumeSpecName: "config-data") pod "21d66874-b86c-45c1-80b3-23f6f4741c37" (UID: "21d66874-b86c-45c1-80b3-23f6f4741c37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.288919 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66dxp\" (UniqueName: \"kubernetes.io/projected/21d66874-b86c-45c1-80b3-23f6f4741c37-kube-api-access-66dxp\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.288978 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.289003 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.289019 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d66874-b86c-45c1-80b3-23f6f4741c37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.599001 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8pzcg" event={"ID":"21d66874-b86c-45c1-80b3-23f6f4741c37","Type":"ContainerDied","Data":"a40b53b5850c471f1afda84817bac8af122147e20970258774ef7e1ee2ad082f"} Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.599322 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a40b53b5850c471f1afda84817bac8af122147e20970258774ef7e1ee2ad082f" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.599103 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8pzcg" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.838959 4841 scope.go:117] "RemoveContainer" containerID="693fa6c383d647bff707e202f8e0064663d93ab71ebd7809058ad7b9c824121b" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.892994 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:51 crc kubenswrapper[4841]: E0130 06:42:51.895207 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d66874-b86c-45c1-80b3-23f6f4741c37" containerName="glance-db-sync" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.895232 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d66874-b86c-45c1-80b3-23f6f4741c37" containerName="glance-db-sync" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.895518 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d66874-b86c-45c1-80b3-23f6f4741c37" containerName="glance-db-sync" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.896609 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.901808 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s4m68" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.901958 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.902258 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.904856 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.994218 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6594c57-jm9l7"] Jan 30 06:42:51 crc kubenswrapper[4841]: I0130 06:42:51.995544 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.004412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8wm4\" (UniqueName: \"kubernetes.io/projected/126a050b-ff90-402d-8146-45c1851d55e3-kube-api-access-s8wm4\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.004480 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.004500 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.004559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-logs\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.004583 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.004605 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.041459 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6594c57-jm9l7"] Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.076368 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.077624 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.079924 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.093707 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107370 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8wm4\" (UniqueName: 
\"kubernetes.io/projected/126a050b-ff90-402d-8146-45c1851d55e3-kube-api-access-s8wm4\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107431 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107457 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107503 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107539 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-config\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107578 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-logs\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107600 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-scripts\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107620 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-config-data\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107645 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107665 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-config-data\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107705 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b79v\" (UniqueName: \"kubernetes.io/projected/817caeb4-49b8-4e82-a870-e9bb3792ff16-kube-api-access-2b79v\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107734 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-logs\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp55g\" (UniqueName: \"kubernetes.io/projected/323921fc-1966-445c-b59f-349fb5396568-kube-api-access-gp55g\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.107769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-dns-svc\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.108516 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.108801 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-logs\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.113894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-config-data\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.117010 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-scripts\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.126280 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.133970 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8wm4\" (UniqueName: \"kubernetes.io/projected/126a050b-ff90-402d-8146-45c1851d55e3-kube-api-access-s8wm4\") pod \"glance-default-external-api-0\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209154 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209204 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-config\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-config-data\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209303 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b79v\" (UniqueName: \"kubernetes.io/projected/817caeb4-49b8-4e82-a870-e9bb3792ff16-kube-api-access-2b79v\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209349 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-logs\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209364 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp55g\" (UniqueName: \"kubernetes.io/projected/323921fc-1966-445c-b59f-349fb5396568-kube-api-access-gp55g\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209382 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-dns-svc\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.209421 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.210013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-logs\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.210154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.210303 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.210333 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-config\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.210800 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.210971 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-dns-svc\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.213784 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-scripts\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.215159 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.215718 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-config-data\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.224160 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b79v\" (UniqueName: \"kubernetes.io/projected/817caeb4-49b8-4e82-a870-e9bb3792ff16-kube-api-access-2b79v\") pod \"dnsmasq-dns-6b6594c57-jm9l7\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.226151 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.232767 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp55g\" (UniqueName: \"kubernetes.io/projected/323921fc-1966-445c-b59f-349fb5396568-kube-api-access-gp55g\") pod \"glance-default-internal-api-0\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.365569 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.408022 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.725599 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.951343 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:52 crc kubenswrapper[4841]: I0130 06:42:52.951415 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.001574 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6594c57-jm9l7"] Jan 30 06:42:53 crc kubenswrapper[4841]: W0130 06:42:53.010365 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod817caeb4_49b8_4e82_a870_e9bb3792ff16.slice/crio-8e42a2589124d808a547427b4aced264d45d8d0c8091f1da5580fcce37549b29 WatchSource:0}: Error finding container 8e42a2589124d808a547427b4aced264d45d8d0c8091f1da5580fcce37549b29: Status 404 returned error can't find the container with id 8e42a2589124d808a547427b4aced264d45d8d0c8091f1da5580fcce37549b29 Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.616858 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"323921fc-1966-445c-b59f-349fb5396568","Type":"ContainerStarted","Data":"2973ebe20c03117130dcd91a9081e87b210d967cefc50beb12d27e7de4c45c65"} Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.617119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"323921fc-1966-445c-b59f-349fb5396568","Type":"ContainerStarted","Data":"a151826ac5e441910c50702dd08ad0e4c9ba6da65af207a9a342a358c2482cd7"} Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.618872 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126a050b-ff90-402d-8146-45c1851d55e3","Type":"ContainerStarted","Data":"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0"} Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.618901 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126a050b-ff90-402d-8146-45c1851d55e3","Type":"ContainerStarted","Data":"6c918de50a07846db3ee84d5736a263c42482d70fe21c300813d42d12ee07da2"} Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.620565 4841 generic.go:334] "Generic (PLEG): container finished" podID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerID="00ef51876d22b3b5f234ad11e518c5acb25e88862e920270beab80256f38ced4" exitCode=0 Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.620611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" event={"ID":"817caeb4-49b8-4e82-a870-e9bb3792ff16","Type":"ContainerDied","Data":"00ef51876d22b3b5f234ad11e518c5acb25e88862e920270beab80256f38ced4"} Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.620641 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" event={"ID":"817caeb4-49b8-4e82-a870-e9bb3792ff16","Type":"ContainerStarted","Data":"8e42a2589124d808a547427b4aced264d45d8d0c8091f1da5580fcce37549b29"} Jan 30 06:42:53 crc kubenswrapper[4841]: I0130 06:42:53.999301 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.445416 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:42:54 crc kubenswrapper[4841]: E0130 06:42:54.446032 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.632852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" event={"ID":"817caeb4-49b8-4e82-a870-e9bb3792ff16","Type":"ContainerStarted","Data":"5f14bcfe4d5d1da7333f11614795fc8eeb991f60df718c2bcafabb4df680d67c"} Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.634038 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.640771 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"323921fc-1966-445c-b59f-349fb5396568","Type":"ContainerStarted","Data":"63126f09e23095d3e5ab95b1aa0ca67793ef22b7b93db4712d1bebc5b6751bdb"} Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.640950 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-log" containerID="cri-o://2973ebe20c03117130dcd91a9081e87b210d967cefc50beb12d27e7de4c45c65" gracePeriod=30 Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.641210 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-httpd" containerID="cri-o://63126f09e23095d3e5ab95b1aa0ca67793ef22b7b93db4712d1bebc5b6751bdb" gracePeriod=30 Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.648313 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"126a050b-ff90-402d-8146-45c1851d55e3","Type":"ContainerStarted","Data":"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73"} Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.648448 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-log" containerID="cri-o://1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0" gracePeriod=30 Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.648525 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-httpd" containerID="cri-o://116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73" gracePeriod=30 Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.680837 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" podStartSLOduration=3.680813113 podStartE2EDuration="3.680813113s" podCreationTimestamp="2026-01-30 06:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:54.656264628 +0000 UTC m=+5711.649737266" watchObservedRunningTime="2026-01-30 06:42:54.680813113 +0000 UTC m=+5711.674285761" Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.689083 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.6890573829999997 podStartE2EDuration="3.689057383s" podCreationTimestamp="2026-01-30 06:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:54.683070943 +0000 UTC m=+5711.676543621" watchObservedRunningTime="2026-01-30 06:42:54.689057383 +0000 UTC 
m=+5711.682530051" Jan 30 06:42:54 crc kubenswrapper[4841]: I0130 06:42:54.704610 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.704592687 podStartE2EDuration="2.704592687s" podCreationTimestamp="2026-01-30 06:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:54.699171383 +0000 UTC m=+5711.692644021" watchObservedRunningTime="2026-01-30 06:42:54.704592687 +0000 UTC m=+5711.698065315" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.299194 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.390417 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-config-data\") pod \"126a050b-ff90-402d-8146-45c1851d55e3\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.390484 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-combined-ca-bundle\") pod \"126a050b-ff90-402d-8146-45c1851d55e3\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.390643 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-httpd-run\") pod \"126a050b-ff90-402d-8146-45c1851d55e3\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.390707 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-s8wm4\" (UniqueName: \"kubernetes.io/projected/126a050b-ff90-402d-8146-45c1851d55e3-kube-api-access-s8wm4\") pod \"126a050b-ff90-402d-8146-45c1851d55e3\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.391641 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "126a050b-ff90-402d-8146-45c1851d55e3" (UID: "126a050b-ff90-402d-8146-45c1851d55e3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.391783 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-scripts\") pod \"126a050b-ff90-402d-8146-45c1851d55e3\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.391842 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-logs\") pod \"126a050b-ff90-402d-8146-45c1851d55e3\" (UID: \"126a050b-ff90-402d-8146-45c1851d55e3\") " Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.392091 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-logs" (OuterVolumeSpecName: "logs") pod "126a050b-ff90-402d-8146-45c1851d55e3" (UID: "126a050b-ff90-402d-8146-45c1851d55e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.392732 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.392750 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126a050b-ff90-402d-8146-45c1851d55e3-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.399167 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-scripts" (OuterVolumeSpecName: "scripts") pod "126a050b-ff90-402d-8146-45c1851d55e3" (UID: "126a050b-ff90-402d-8146-45c1851d55e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.421346 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126a050b-ff90-402d-8146-45c1851d55e3-kube-api-access-s8wm4" (OuterVolumeSpecName: "kube-api-access-s8wm4") pod "126a050b-ff90-402d-8146-45c1851d55e3" (UID: "126a050b-ff90-402d-8146-45c1851d55e3"). InnerVolumeSpecName "kube-api-access-s8wm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.424044 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "126a050b-ff90-402d-8146-45c1851d55e3" (UID: "126a050b-ff90-402d-8146-45c1851d55e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.461758 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-config-data" (OuterVolumeSpecName: "config-data") pod "126a050b-ff90-402d-8146-45c1851d55e3" (UID: "126a050b-ff90-402d-8146-45c1851d55e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.494088 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8wm4\" (UniqueName: \"kubernetes.io/projected/126a050b-ff90-402d-8146-45c1851d55e3-kube-api-access-s8wm4\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.494112 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.494123 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.494132 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a050b-ff90-402d-8146-45c1851d55e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.662618 4841 generic.go:334] "Generic (PLEG): container finished" podID="126a050b-ff90-402d-8146-45c1851d55e3" containerID="116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73" exitCode=0 Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.662675 4841 generic.go:334] "Generic (PLEG): container finished" podID="126a050b-ff90-402d-8146-45c1851d55e3" 
containerID="1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0" exitCode=143 Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.662746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126a050b-ff90-402d-8146-45c1851d55e3","Type":"ContainerDied","Data":"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73"} Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.662787 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126a050b-ff90-402d-8146-45c1851d55e3","Type":"ContainerDied","Data":"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0"} Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.662806 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"126a050b-ff90-402d-8146-45c1851d55e3","Type":"ContainerDied","Data":"6c918de50a07846db3ee84d5736a263c42482d70fe21c300813d42d12ee07da2"} Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.662832 4841 scope.go:117] "RemoveContainer" containerID="116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.663012 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.675917 4841 generic.go:334] "Generic (PLEG): container finished" podID="323921fc-1966-445c-b59f-349fb5396568" containerID="63126f09e23095d3e5ab95b1aa0ca67793ef22b7b93db4712d1bebc5b6751bdb" exitCode=0 Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.675966 4841 generic.go:334] "Generic (PLEG): container finished" podID="323921fc-1966-445c-b59f-349fb5396568" containerID="2973ebe20c03117130dcd91a9081e87b210d967cefc50beb12d27e7de4c45c65" exitCode=143 Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.676323 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"323921fc-1966-445c-b59f-349fb5396568","Type":"ContainerDied","Data":"63126f09e23095d3e5ab95b1aa0ca67793ef22b7b93db4712d1bebc5b6751bdb"} Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.676517 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"323921fc-1966-445c-b59f-349fb5396568","Type":"ContainerDied","Data":"2973ebe20c03117130dcd91a9081e87b210d967cefc50beb12d27e7de4c45c65"} Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.729176 4841 scope.go:117] "RemoveContainer" containerID="1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.738076 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.756950 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.769133 4841 scope.go:117] "RemoveContainer" containerID="116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73" Jan 30 06:42:55 crc kubenswrapper[4841]: E0130 06:42:55.770751 4841 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73\": container with ID starting with 116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73 not found: ID does not exist" containerID="116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.770878 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73"} err="failed to get container status \"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73\": rpc error: code = NotFound desc = could not find container \"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73\": container with ID starting with 116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73 not found: ID does not exist" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.770926 4841 scope.go:117] "RemoveContainer" containerID="1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0" Jan 30 06:42:55 crc kubenswrapper[4841]: E0130 06:42:55.771461 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0\": container with ID starting with 1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0 not found: ID does not exist" containerID="1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.771500 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0"} err="failed to get container status \"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0\": rpc error: code = NotFound desc = could not find container 
\"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0\": container with ID starting with 1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0 not found: ID does not exist" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.771530 4841 scope.go:117] "RemoveContainer" containerID="116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.772768 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73"} err="failed to get container status \"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73\": rpc error: code = NotFound desc = could not find container \"116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73\": container with ID starting with 116ce6bc760aa74bdecee0979965e93bd954887836da184d4e82b5e93b04eb73 not found: ID does not exist" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.772799 4841 scope.go:117] "RemoveContainer" containerID="1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.773159 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0"} err="failed to get container status \"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0\": rpc error: code = NotFound desc = could not find container \"1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0\": container with ID starting with 1d3f86dd16b4c05a3c038698f290e45fccbf1fada297f00bcf0f53490ebc8be0 not found: ID does not exist" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.778138 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:55 crc kubenswrapper[4841]: E0130 06:42:55.778546 4841 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-httpd" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.778567 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-httpd" Jan 30 06:42:55 crc kubenswrapper[4841]: E0130 06:42:55.778604 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-log" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.778612 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-log" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.778832 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-log" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.778864 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="126a050b-ff90-402d-8146-45c1851d55e3" containerName="glance-httpd" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.780822 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.783852 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.784769 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.804364 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.902931 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflds\" (UniqueName: \"kubernetes.io/projected/95f2605a-425e-4320-a8c0-e78d3ad93fbb-kube-api-access-jflds\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.903018 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.903048 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.903182 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-scripts\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.903272 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.903335 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-logs\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:55 crc kubenswrapper[4841]: I0130 06:42:55.903516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-config-data\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005142 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jflds\" (UniqueName: \"kubernetes.io/projected/95f2605a-425e-4320-a8c0-e78d3ad93fbb-kube-api-access-jflds\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005247 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005281 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-scripts\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005338 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-logs\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005380 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-config-data\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.005869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.006023 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-logs\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.012913 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.013023 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.014667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.016815 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-config-data\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.023821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflds\" (UniqueName: \"kubernetes.io/projected/95f2605a-425e-4320-a8c0-e78d3ad93fbb-kube-api-access-jflds\") pod \"glance-default-external-api-0\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.085648 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107002 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107021 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-logs\") pod \"323921fc-1966-445c-b59f-349fb5396568\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107208 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-scripts\") pod \"323921fc-1966-445c-b59f-349fb5396568\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107328 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp55g\" (UniqueName: \"kubernetes.io/projected/323921fc-1966-445c-b59f-349fb5396568-kube-api-access-gp55g\") pod \"323921fc-1966-445c-b59f-349fb5396568\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107352 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-logs" (OuterVolumeSpecName: "logs") pod "323921fc-1966-445c-b59f-349fb5396568" (UID: "323921fc-1966-445c-b59f-349fb5396568"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-httpd-run\") pod \"323921fc-1966-445c-b59f-349fb5396568\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107544 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-config-data\") pod \"323921fc-1966-445c-b59f-349fb5396568\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107586 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-combined-ca-bundle\") pod \"323921fc-1966-445c-b59f-349fb5396568\" (UID: \"323921fc-1966-445c-b59f-349fb5396568\") " Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.107656 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "323921fc-1966-445c-b59f-349fb5396568" (UID: "323921fc-1966-445c-b59f-349fb5396568"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.108122 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.108140 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/323921fc-1966-445c-b59f-349fb5396568-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.110786 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-scripts" (OuterVolumeSpecName: "scripts") pod "323921fc-1966-445c-b59f-349fb5396568" (UID: "323921fc-1966-445c-b59f-349fb5396568"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.112672 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323921fc-1966-445c-b59f-349fb5396568-kube-api-access-gp55g" (OuterVolumeSpecName: "kube-api-access-gp55g") pod "323921fc-1966-445c-b59f-349fb5396568" (UID: "323921fc-1966-445c-b59f-349fb5396568"). InnerVolumeSpecName "kube-api-access-gp55g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.132340 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "323921fc-1966-445c-b59f-349fb5396568" (UID: "323921fc-1966-445c-b59f-349fb5396568"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.162612 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-config-data" (OuterVolumeSpecName: "config-data") pod "323921fc-1966-445c-b59f-349fb5396568" (UID: "323921fc-1966-445c-b59f-349fb5396568"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.210155 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.210193 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp55g\" (UniqueName: \"kubernetes.io/projected/323921fc-1966-445c-b59f-349fb5396568-kube-api-access-gp55g\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.210204 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.210213 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/323921fc-1966-445c-b59f-349fb5396568-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.448117 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126a050b-ff90-402d-8146-45c1851d55e3" path="/var/lib/kubelet/pods/126a050b-ff90-402d-8146-45c1851d55e3/volumes" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.689597 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:42:56 crc 
kubenswrapper[4841]: I0130 06:42:56.691685 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"323921fc-1966-445c-b59f-349fb5396568","Type":"ContainerDied","Data":"a151826ac5e441910c50702dd08ad0e4c9ba6da65af207a9a342a358c2482cd7"} Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.691718 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.691768 4841 scope.go:117] "RemoveContainer" containerID="63126f09e23095d3e5ab95b1aa0ca67793ef22b7b93db4712d1bebc5b6751bdb" Jan 30 06:42:56 crc kubenswrapper[4841]: W0130 06:42:56.699035 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95f2605a_425e_4320_a8c0_e78d3ad93fbb.slice/crio-f82953e97852f28f78d9a9197436b66479991a54a292afbf06ca136c6a3af681 WatchSource:0}: Error finding container f82953e97852f28f78d9a9197436b66479991a54a292afbf06ca136c6a3af681: Status 404 returned error can't find the container with id f82953e97852f28f78d9a9197436b66479991a54a292afbf06ca136c6a3af681 Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.836821 4841 scope.go:117] "RemoveContainer" containerID="2973ebe20c03117130dcd91a9081e87b210d967cefc50beb12d27e7de4c45c65" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.850558 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.867235 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.878440 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:56 crc kubenswrapper[4841]: E0130 06:42:56.878856 4841 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-log" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.878883 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-log" Jan 30 06:42:56 crc kubenswrapper[4841]: E0130 06:42:56.878915 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-httpd" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.878928 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-httpd" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.879149 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-httpd" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.879174 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="323921fc-1966-445c-b59f-349fb5396568" containerName="glance-log" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.880458 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.886311 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.889214 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 06:42:56 crc kubenswrapper[4841]: I0130 06:42:56.889330 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.035145 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.035661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.035707 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82js2\" (UniqueName: \"kubernetes.io/projected/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-kube-api-access-82js2\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.035752 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-logs\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.035866 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.035902 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.035945 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.138205 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.138263 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82js2\" 
(UniqueName: \"kubernetes.io/projected/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-kube-api-access-82js2\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.138289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-logs\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.138386 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.138460 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.138492 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.138528 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.139086 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.140975 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-logs\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.145123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.150477 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.154391 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.157652 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82js2\" (UniqueName: \"kubernetes.io/projected/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-kube-api-access-82js2\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.176251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.198458 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.705299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95f2605a-425e-4320-a8c0-e78d3ad93fbb","Type":"ContainerStarted","Data":"91bc07556715b9b9bfde8ae1343195414189761c5a84ed76fb8332062c9a4a0c"} Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.705726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95f2605a-425e-4320-a8c0-e78d3ad93fbb","Type":"ContainerStarted","Data":"f82953e97852f28f78d9a9197436b66479991a54a292afbf06ca136c6a3af681"} Jan 30 06:42:57 crc kubenswrapper[4841]: I0130 06:42:57.846466 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:42:58 crc kubenswrapper[4841]: I0130 06:42:58.447121 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="323921fc-1966-445c-b59f-349fb5396568" path="/var/lib/kubelet/pods/323921fc-1966-445c-b59f-349fb5396568/volumes" Jan 30 06:42:58 crc kubenswrapper[4841]: I0130 06:42:58.731579 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95f2605a-425e-4320-a8c0-e78d3ad93fbb","Type":"ContainerStarted","Data":"71df0b5d837f19d4320f3cdb5e1bd39c13694d215319db8e6ce04bcd80b0fbae"} Jan 30 06:42:58 crc kubenswrapper[4841]: I0130 06:42:58.741508 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b63ca59c-2c67-4c8d-8cef-344f3e76ba06","Type":"ContainerStarted","Data":"f23d5789a7d0f92a2ffe4f082632a36d784cdaaf4d451234700e7ea77d4bc8d5"} Jan 30 06:42:58 crc kubenswrapper[4841]: I0130 06:42:58.741540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b63ca59c-2c67-4c8d-8cef-344f3e76ba06","Type":"ContainerStarted","Data":"7bb80d22258d1cf0748f2f288adaec59ba82de119a9327a15a27c6bae10d45c4"} Jan 30 06:42:58 crc kubenswrapper[4841]: I0130 06:42:58.765613 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.765595317 podStartE2EDuration="3.765595317s" podCreationTimestamp="2026-01-30 06:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:58.754013808 +0000 UTC m=+5715.747486466" watchObservedRunningTime="2026-01-30 06:42:58.765595317 +0000 UTC m=+5715.759067945" Jan 30 06:42:59 crc kubenswrapper[4841]: I0130 06:42:59.757046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b63ca59c-2c67-4c8d-8cef-344f3e76ba06","Type":"ContainerStarted","Data":"80f19d874887893ff5379339d65dec8893e864761923c4b813ab8523b9791d51"} Jan 30 06:42:59 crc kubenswrapper[4841]: 
I0130 06:42:59.800388 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.800346077 podStartE2EDuration="3.800346077s" podCreationTimestamp="2026-01-30 06:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:42:59.787236337 +0000 UTC m=+5716.780709015" watchObservedRunningTime="2026-01-30 06:42:59.800346077 +0000 UTC m=+5716.793818715" Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.367689 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.480924 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ddb7bdbd9-frmht"] Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.481187 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" podUID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerName="dnsmasq-dns" containerID="cri-o://9c485f799d5c5fbfcd23a5f96ca4c8c301dc863b46a22284fc74b96700d63030" gracePeriod=10 Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.794823 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerID="9c485f799d5c5fbfcd23a5f96ca4c8c301dc863b46a22284fc74b96700d63030" exitCode=0 Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.794909 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" event={"ID":"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c","Type":"ContainerDied","Data":"9c485f799d5c5fbfcd23a5f96ca4c8c301dc863b46a22284fc74b96700d63030"} Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.976478 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.994957 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-dns-svc\") pod \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.995067 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-nb\") pod \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.995094 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-kube-api-access-bt9pm\") pod \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.995116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-config\") pod \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " Jan 30 06:43:02 crc kubenswrapper[4841]: I0130 06:43:02.995190 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-sb\") pod \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\" (UID: \"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c\") " Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.024526 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-kube-api-access-bt9pm" (OuterVolumeSpecName: "kube-api-access-bt9pm") pod "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" (UID: "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c"). InnerVolumeSpecName "kube-api-access-bt9pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.058185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-config" (OuterVolumeSpecName: "config") pod "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" (UID: "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.069861 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" (UID: "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.074318 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" (UID: "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.085339 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" (UID: "ef030afe-aec2-4af3-b1c0-33db5e4c1b4c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.097926 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.097970 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.097986 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.098000 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt9pm\" (UniqueName: \"kubernetes.io/projected/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-kube-api-access-bt9pm\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.098013 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.812054 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" event={"ID":"ef030afe-aec2-4af3-b1c0-33db5e4c1b4c","Type":"ContainerDied","Data":"a5ec3e169419ad48a3bcd5daeef550111b20e2e5f7599e63476eab7ca7a7532c"} Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.812349 4841 scope.go:117] "RemoveContainer" containerID="9c485f799d5c5fbfcd23a5f96ca4c8c301dc863b46a22284fc74b96700d63030" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.812207 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ddb7bdbd9-frmht" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.857034 4841 scope.go:117] "RemoveContainer" containerID="62f3ccdf3fca402e14cbb5381667d579911d5599cd0a2a5f20e009765ef424d8" Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.880300 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ddb7bdbd9-frmht"] Jan 30 06:43:03 crc kubenswrapper[4841]: I0130 06:43:03.894056 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ddb7bdbd9-frmht"] Jan 30 06:43:04 crc kubenswrapper[4841]: I0130 06:43:04.445872 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" path="/var/lib/kubelet/pods/ef030afe-aec2-4af3-b1c0-33db5e4c1b4c/volumes" Jan 30 06:43:06 crc kubenswrapper[4841]: I0130 06:43:06.108135 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 06:43:06 crc kubenswrapper[4841]: I0130 06:43:06.108708 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 06:43:06 crc kubenswrapper[4841]: I0130 06:43:06.165075 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 06:43:06 crc kubenswrapper[4841]: I0130 06:43:06.180386 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 06:43:06 crc kubenswrapper[4841]: I0130 06:43:06.846670 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 06:43:06 crc kubenswrapper[4841]: I0130 06:43:06.846760 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 06:43:07 crc kubenswrapper[4841]: I0130 06:43:07.199512 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:07 crc kubenswrapper[4841]: I0130 06:43:07.199625 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:07 crc kubenswrapper[4841]: I0130 06:43:07.253440 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:07 crc kubenswrapper[4841]: I0130 06:43:07.297843 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:07 crc kubenswrapper[4841]: I0130 06:43:07.435129 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:43:07 crc kubenswrapper[4841]: E0130 06:43:07.435427 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:43:07 crc kubenswrapper[4841]: I0130 06:43:07.860619 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:07 crc kubenswrapper[4841]: I0130 06:43:07.860681 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:08 crc kubenswrapper[4841]: I0130 06:43:08.636580 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 06:43:08 crc kubenswrapper[4841]: I0130 06:43:08.703909 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 06:43:09 crc kubenswrapper[4841]: I0130 06:43:09.677266 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:09 crc kubenswrapper[4841]: I0130 06:43:09.749996 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.047785 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-458jh"] Jan 30 06:43:16 crc kubenswrapper[4841]: E0130 06:43:16.048888 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerName="dnsmasq-dns" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.048909 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerName="dnsmasq-dns" Jan 30 06:43:16 crc kubenswrapper[4841]: E0130 06:43:16.048964 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerName="init" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.048975 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerName="init" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.049256 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef030afe-aec2-4af3-b1c0-33db5e4c1b4c" containerName="dnsmasq-dns" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.050096 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.086537 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-458jh"] Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.147634 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e656-account-create-update-hbqxm"] Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.148992 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.150992 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.165616 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e656-account-create-update-hbqxm"] Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.172960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjzc\" (UniqueName: \"kubernetes.io/projected/784a3ed6-714b-4646-9070-4b660e0ee354-kube-api-access-fbjzc\") pod \"placement-db-create-458jh\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") " pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.173260 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784a3ed6-714b-4646-9070-4b660e0ee354-operator-scripts\") pod \"placement-db-create-458jh\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") " pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.275087 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/784a3ed6-714b-4646-9070-4b660e0ee354-operator-scripts\") pod \"placement-db-create-458jh\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") " pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.275208 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjzc\" (UniqueName: \"kubernetes.io/projected/784a3ed6-714b-4646-9070-4b660e0ee354-kube-api-access-fbjzc\") pod \"placement-db-create-458jh\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") " pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.275450 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkx7k\" (UniqueName: \"kubernetes.io/projected/6db6f32a-2832-4c6d-a811-172f210560d4-kube-api-access-nkx7k\") pod \"placement-e656-account-create-update-hbqxm\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") " pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.275500 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6db6f32a-2832-4c6d-a811-172f210560d4-operator-scripts\") pod \"placement-e656-account-create-update-hbqxm\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") " pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.276216 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784a3ed6-714b-4646-9070-4b660e0ee354-operator-scripts\") pod \"placement-db-create-458jh\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") " pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.297250 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-fbjzc\" (UniqueName: \"kubernetes.io/projected/784a3ed6-714b-4646-9070-4b660e0ee354-kube-api-access-fbjzc\") pod \"placement-db-create-458jh\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") " pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.377530 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkx7k\" (UniqueName: \"kubernetes.io/projected/6db6f32a-2832-4c6d-a811-172f210560d4-kube-api-access-nkx7k\") pod \"placement-e656-account-create-update-hbqxm\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") " pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.377587 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6db6f32a-2832-4c6d-a811-172f210560d4-operator-scripts\") pod \"placement-e656-account-create-update-hbqxm\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") " pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.378520 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6db6f32a-2832-4c6d-a811-172f210560d4-operator-scripts\") pod \"placement-e656-account-create-update-hbqxm\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") " pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.380830 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-458jh" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.411475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkx7k\" (UniqueName: \"kubernetes.io/projected/6db6f32a-2832-4c6d-a811-172f210560d4-kube-api-access-nkx7k\") pod \"placement-e656-account-create-update-hbqxm\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") " pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.473622 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e656-account-create-update-hbqxm" Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.922141 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-458jh"] Jan 30 06:43:16 crc kubenswrapper[4841]: W0130 06:43:16.926765 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784a3ed6_714b_4646_9070_4b660e0ee354.slice/crio-83aa1b4c920d47e1ff047594125144f4ee1af15a8a732e51ad204d547cdbfd88 WatchSource:0}: Error finding container 83aa1b4c920d47e1ff047594125144f4ee1af15a8a732e51ad204d547cdbfd88: Status 404 returned error can't find the container with id 83aa1b4c920d47e1ff047594125144f4ee1af15a8a732e51ad204d547cdbfd88 Jan 30 06:43:16 crc kubenswrapper[4841]: I0130 06:43:16.959268 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-458jh" event={"ID":"784a3ed6-714b-4646-9070-4b660e0ee354","Type":"ContainerStarted","Data":"83aa1b4c920d47e1ff047594125144f4ee1af15a8a732e51ad204d547cdbfd88"} Jan 30 06:43:17 crc kubenswrapper[4841]: I0130 06:43:17.004765 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e656-account-create-update-hbqxm"] Jan 30 06:43:17 crc kubenswrapper[4841]: I0130 06:43:17.970628 4841 generic.go:334] "Generic (PLEG): container 
finished" podID="6db6f32a-2832-4c6d-a811-172f210560d4" containerID="5feca6095bf49629173a7a7a17b80a89b2f2c3a7c6b0e0948420042eccf5d4fc" exitCode=0
Jan 30 06:43:17 crc kubenswrapper[4841]: I0130 06:43:17.971261 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e656-account-create-update-hbqxm" event={"ID":"6db6f32a-2832-4c6d-a811-172f210560d4","Type":"ContainerDied","Data":"5feca6095bf49629173a7a7a17b80a89b2f2c3a7c6b0e0948420042eccf5d4fc"}
Jan 30 06:43:17 crc kubenswrapper[4841]: I0130 06:43:17.971425 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e656-account-create-update-hbqxm" event={"ID":"6db6f32a-2832-4c6d-a811-172f210560d4","Type":"ContainerStarted","Data":"e9ec1f64481cb00199c361a1b182bbdcef50417af0f0def10f0d78dc1f5c4a0b"}
Jan 30 06:43:17 crc kubenswrapper[4841]: I0130 06:43:17.974637 4841 generic.go:334] "Generic (PLEG): container finished" podID="784a3ed6-714b-4646-9070-4b660e0ee354" containerID="60f9c1d8b3bcccb14da8a6e1fce165f56e460f764bc86eae5015fcbf64d02369" exitCode=0
Jan 30 06:43:17 crc kubenswrapper[4841]: I0130 06:43:17.974685 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-458jh" event={"ID":"784a3ed6-714b-4646-9070-4b660e0ee354","Type":"ContainerDied","Data":"60f9c1d8b3bcccb14da8a6e1fce165f56e460f764bc86eae5015fcbf64d02369"}
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.504013 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-458jh"
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.510620 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e656-account-create-update-hbqxm"
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.646991 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6db6f32a-2832-4c6d-a811-172f210560d4-operator-scripts\") pod \"6db6f32a-2832-4c6d-a811-172f210560d4\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") "
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.647125 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkx7k\" (UniqueName: \"kubernetes.io/projected/6db6f32a-2832-4c6d-a811-172f210560d4-kube-api-access-nkx7k\") pod \"6db6f32a-2832-4c6d-a811-172f210560d4\" (UID: \"6db6f32a-2832-4c6d-a811-172f210560d4\") "
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.647207 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784a3ed6-714b-4646-9070-4b660e0ee354-operator-scripts\") pod \"784a3ed6-714b-4646-9070-4b660e0ee354\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") "
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.647232 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbjzc\" (UniqueName: \"kubernetes.io/projected/784a3ed6-714b-4646-9070-4b660e0ee354-kube-api-access-fbjzc\") pod \"784a3ed6-714b-4646-9070-4b660e0ee354\" (UID: \"784a3ed6-714b-4646-9070-4b660e0ee354\") "
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.647891 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784a3ed6-714b-4646-9070-4b660e0ee354-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "784a3ed6-714b-4646-9070-4b660e0ee354" (UID: "784a3ed6-714b-4646-9070-4b660e0ee354"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.648143 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db6f32a-2832-4c6d-a811-172f210560d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6db6f32a-2832-4c6d-a811-172f210560d4" (UID: "6db6f32a-2832-4c6d-a811-172f210560d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.653142 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db6f32a-2832-4c6d-a811-172f210560d4-kube-api-access-nkx7k" (OuterVolumeSpecName: "kube-api-access-nkx7k") pod "6db6f32a-2832-4c6d-a811-172f210560d4" (UID: "6db6f32a-2832-4c6d-a811-172f210560d4"). InnerVolumeSpecName "kube-api-access-nkx7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.654152 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784a3ed6-714b-4646-9070-4b660e0ee354-kube-api-access-fbjzc" (OuterVolumeSpecName: "kube-api-access-fbjzc") pod "784a3ed6-714b-4646-9070-4b660e0ee354" (UID: "784a3ed6-714b-4646-9070-4b660e0ee354"). InnerVolumeSpecName "kube-api-access-fbjzc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.749433 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784a3ed6-714b-4646-9070-4b660e0ee354-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.749469 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbjzc\" (UniqueName: \"kubernetes.io/projected/784a3ed6-714b-4646-9070-4b660e0ee354-kube-api-access-fbjzc\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.749481 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6db6f32a-2832-4c6d-a811-172f210560d4-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:19 crc kubenswrapper[4841]: I0130 06:43:19.749490 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkx7k\" (UniqueName: \"kubernetes.io/projected/6db6f32a-2832-4c6d-a811-172f210560d4-kube-api-access-nkx7k\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:20 crc kubenswrapper[4841]: I0130 06:43:20.000709 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e656-account-create-update-hbqxm" event={"ID":"6db6f32a-2832-4c6d-a811-172f210560d4","Type":"ContainerDied","Data":"e9ec1f64481cb00199c361a1b182bbdcef50417af0f0def10f0d78dc1f5c4a0b"}
Jan 30 06:43:20 crc kubenswrapper[4841]: I0130 06:43:20.000739 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e656-account-create-update-hbqxm"
Jan 30 06:43:20 crc kubenswrapper[4841]: I0130 06:43:20.000755 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9ec1f64481cb00199c361a1b182bbdcef50417af0f0def10f0d78dc1f5c4a0b"
Jan 30 06:43:20 crc kubenswrapper[4841]: I0130 06:43:20.002499 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-458jh" event={"ID":"784a3ed6-714b-4646-9070-4b660e0ee354","Type":"ContainerDied","Data":"83aa1b4c920d47e1ff047594125144f4ee1af15a8a732e51ad204d547cdbfd88"}
Jan 30 06:43:20 crc kubenswrapper[4841]: I0130 06:43:20.002524 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83aa1b4c920d47e1ff047594125144f4ee1af15a8a732e51ad204d547cdbfd88"
Jan 30 06:43:20 crc kubenswrapper[4841]: I0130 06:43:20.002581 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-458jh"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.488227 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4d4c5f95-l26l4"]
Jan 30 06:43:21 crc kubenswrapper[4841]: E0130 06:43:21.488899 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db6f32a-2832-4c6d-a811-172f210560d4" containerName="mariadb-account-create-update"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.488914 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db6f32a-2832-4c6d-a811-172f210560d4" containerName="mariadb-account-create-update"
Jan 30 06:43:21 crc kubenswrapper[4841]: E0130 06:43:21.488960 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784a3ed6-714b-4646-9070-4b660e0ee354" containerName="mariadb-database-create"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.488970 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="784a3ed6-714b-4646-9070-4b660e0ee354" containerName="mariadb-database-create"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.489168 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="784a3ed6-714b-4646-9070-4b660e0ee354" containerName="mariadb-database-create"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.489206 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db6f32a-2832-4c6d-a811-172f210560d4" containerName="mariadb-account-create-update"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.490311 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.497816 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4d4c5f95-l26l4"]
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.510049 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-84ztj"]
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.511155 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.519111 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-84ztj"]
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.524638 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ngtg7"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.527393 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.527967 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.620208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-scripts\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.620727 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp2xl\" (UniqueName: \"kubernetes.io/projected/f34d65dc-52e3-4321-8d51-a5657da47566-kube-api-access-gp2xl\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.620897 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-config\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.621003 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-config-data\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.621116 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-combined-ca-bundle\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.621303 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdpr\" (UniqueName: \"kubernetes.io/projected/5b88f273-35e9-4a59-be7e-de1eebd92300-kube-api-access-8mdpr\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.621447 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-dns-svc\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.621547 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d65dc-52e3-4321-8d51-a5657da47566-logs\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.621615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-nb\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.621698 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-sb\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725332 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-config\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725400 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-config-data\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725439 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-combined-ca-bundle\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdpr\" (UniqueName: \"kubernetes.io/projected/5b88f273-35e9-4a59-be7e-de1eebd92300-kube-api-access-8mdpr\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725522 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-dns-svc\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-nb\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d65dc-52e3-4321-8d51-a5657da47566-logs\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725583 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-sb\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-scripts\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.725641 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp2xl\" (UniqueName: \"kubernetes.io/projected/f34d65dc-52e3-4321-8d51-a5657da47566-kube-api-access-gp2xl\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.726802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-config\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.726944 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d65dc-52e3-4321-8d51-a5657da47566-logs\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.727222 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-dns-svc\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.727333 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-sb\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.727609 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-nb\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.736734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-config-data\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.736802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-scripts\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.738922 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-combined-ca-bundle\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.749205 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mdpr\" (UniqueName: \"kubernetes.io/projected/5b88f273-35e9-4a59-be7e-de1eebd92300-kube-api-access-8mdpr\") pod \"dnsmasq-dns-b4d4c5f95-l26l4\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.749540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp2xl\" (UniqueName: \"kubernetes.io/projected/f34d65dc-52e3-4321-8d51-a5657da47566-kube-api-access-gp2xl\") pod \"placement-db-sync-84ztj\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") " pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.807560 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:21 crc kubenswrapper[4841]: I0130 06:43:21.846000 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.227107 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d2b77"]
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.229229 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.234243 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2b77"]
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.328160 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4d4c5f95-l26l4"]
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.334001 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjxrv\" (UniqueName: \"kubernetes.io/projected/c2d7969d-058f-40ff-a4af-a3c8a60a7973-kube-api-access-tjxrv\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.334077 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-catalog-content\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.334137 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-utilities\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: W0130 06:43:22.338979 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b88f273_35e9_4a59_be7e_de1eebd92300.slice/crio-1fc85c7b4e49d56c1b9d7803bff90acb38b8bbdff1871c9e6160b3a544d32ff4 WatchSource:0}: Error finding container 1fc85c7b4e49d56c1b9d7803bff90acb38b8bbdff1871c9e6160b3a544d32ff4: Status 404 returned error can't find the container with id 1fc85c7b4e49d56c1b9d7803bff90acb38b8bbdff1871c9e6160b3a544d32ff4
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.344267 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-84ztj"]
Jan 30 06:43:22 crc kubenswrapper[4841]: W0130 06:43:22.347999 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34d65dc_52e3_4321_8d51_a5657da47566.slice/crio-3670114a264b73eb1cc2f7987fbb8591fd465359feec8793cb7206ba456e7fa2 WatchSource:0}: Error finding container 3670114a264b73eb1cc2f7987fbb8591fd465359feec8793cb7206ba456e7fa2: Status 404 returned error can't find the container with id 3670114a264b73eb1cc2f7987fbb8591fd465359feec8793cb7206ba456e7fa2
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.433222 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"
Jan 30 06:43:22 crc kubenswrapper[4841]: E0130 06:43:22.433697 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.439371 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjxrv\" (UniqueName: \"kubernetes.io/projected/c2d7969d-058f-40ff-a4af-a3c8a60a7973-kube-api-access-tjxrv\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.439497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-catalog-content\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.439537 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-utilities\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.439945 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-utilities\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.440187 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-catalog-content\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.462883 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjxrv\" (UniqueName: \"kubernetes.io/projected/c2d7969d-058f-40ff-a4af-a3c8a60a7973-kube-api-access-tjxrv\") pod \"redhat-marketplace-d2b77\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.592055 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2b77"
Jan 30 06:43:22 crc kubenswrapper[4841]: I0130 06:43:22.836458 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2b77"]
Jan 30 06:43:22 crc kubenswrapper[4841]: W0130 06:43:22.845684 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d7969d_058f_40ff_a4af_a3c8a60a7973.slice/crio-4768fe9b9a0250303fcf2ba578a921103ebc295d266b833483f65853432c1122 WatchSource:0}: Error finding container 4768fe9b9a0250303fcf2ba578a921103ebc295d266b833483f65853432c1122: Status 404 returned error can't find the container with id 4768fe9b9a0250303fcf2ba578a921103ebc295d266b833483f65853432c1122
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.033183 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-84ztj" event={"ID":"f34d65dc-52e3-4321-8d51-a5657da47566","Type":"ContainerStarted","Data":"03fd3fa2289acdd2fc71a7b7c1a7396e9786a4b0b35bae38706e87599b64b962"}
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.033445 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-84ztj" event={"ID":"f34d65dc-52e3-4321-8d51-a5657da47566","Type":"ContainerStarted","Data":"3670114a264b73eb1cc2f7987fbb8591fd465359feec8793cb7206ba456e7fa2"}
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.036097 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2b77" event={"ID":"c2d7969d-058f-40ff-a4af-a3c8a60a7973","Type":"ContainerStarted","Data":"4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d"}
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.036119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2b77" event={"ID":"c2d7969d-058f-40ff-a4af-a3c8a60a7973","Type":"ContainerStarted","Data":"4768fe9b9a0250303fcf2ba578a921103ebc295d266b833483f65853432c1122"}
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.040337 4841 generic.go:334] "Generic (PLEG): container finished" podID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerID="f68ec22e4f4f230cc6eb11e856ad1151f2ae4e247c6179d44d6fed5e67e21c97" exitCode=0
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.040383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" event={"ID":"5b88f273-35e9-4a59-be7e-de1eebd92300","Type":"ContainerDied","Data":"f68ec22e4f4f230cc6eb11e856ad1151f2ae4e247c6179d44d6fed5e67e21c97"}
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.040439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" event={"ID":"5b88f273-35e9-4a59-be7e-de1eebd92300","Type":"ContainerStarted","Data":"1fc85c7b4e49d56c1b9d7803bff90acb38b8bbdff1871c9e6160b3a544d32ff4"}
Jan 30 06:43:23 crc kubenswrapper[4841]: I0130 06:43:23.058045 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-84ztj" podStartSLOduration=2.058029111 podStartE2EDuration="2.058029111s" podCreationTimestamp="2026-01-30 06:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:43:23.051912338 +0000 UTC m=+5740.045384986" watchObservedRunningTime="2026-01-30 06:43:23.058029111 +0000 UTC m=+5740.051501749"
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.055286 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" event={"ID":"5b88f273-35e9-4a59-be7e-de1eebd92300","Type":"ContainerStarted","Data":"8ae61371f9b8019989aff7c298726f6e521e295879dccf935c7ecca29daa1a6b"}
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.056566 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4"
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.062114 4841 generic.go:334] "Generic (PLEG): container finished" podID="f34d65dc-52e3-4321-8d51-a5657da47566" containerID="03fd3fa2289acdd2fc71a7b7c1a7396e9786a4b0b35bae38706e87599b64b962" exitCode=0
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.062218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-84ztj" event={"ID":"f34d65dc-52e3-4321-8d51-a5657da47566","Type":"ContainerDied","Data":"03fd3fa2289acdd2fc71a7b7c1a7396e9786a4b0b35bae38706e87599b64b962"}
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.064477 4841 generic.go:334] "Generic (PLEG): container finished" podID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerID="4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d" exitCode=0
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.064520 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2b77" event={"ID":"c2d7969d-058f-40ff-a4af-a3c8a60a7973","Type":"ContainerDied","Data":"4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d"}
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.069805 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 06:43:24 crc kubenswrapper[4841]: I0130 06:43:24.094074 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" podStartSLOduration=3.094051205 podStartE2EDuration="3.094051205s" podCreationTimestamp="2026-01-30 06:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:43:24.083802222 +0000 UTC m=+5741.077274870" watchObservedRunningTime="2026-01-30 06:43:24.094051205 +0000 UTC m=+5741.087523853"
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.461232 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-84ztj"
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.635047 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp2xl\" (UniqueName: \"kubernetes.io/projected/f34d65dc-52e3-4321-8d51-a5657da47566-kube-api-access-gp2xl\") pod \"f34d65dc-52e3-4321-8d51-a5657da47566\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") "
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.635315 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-combined-ca-bundle\") pod \"f34d65dc-52e3-4321-8d51-a5657da47566\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") "
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.635341 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-scripts\") pod \"f34d65dc-52e3-4321-8d51-a5657da47566\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") "
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.635374 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d65dc-52e3-4321-8d51-a5657da47566-logs\") pod \"f34d65dc-52e3-4321-8d51-a5657da47566\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") "
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.635452 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-config-data\") pod \"f34d65dc-52e3-4321-8d51-a5657da47566\" (UID: \"f34d65dc-52e3-4321-8d51-a5657da47566\") "
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.635701 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f34d65dc-52e3-4321-8d51-a5657da47566-logs" (OuterVolumeSpecName: "logs") pod "f34d65dc-52e3-4321-8d51-a5657da47566" (UID: "f34d65dc-52e3-4321-8d51-a5657da47566"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.635828 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d65dc-52e3-4321-8d51-a5657da47566-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.642943 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34d65dc-52e3-4321-8d51-a5657da47566-kube-api-access-gp2xl" (OuterVolumeSpecName: "kube-api-access-gp2xl") pod "f34d65dc-52e3-4321-8d51-a5657da47566" (UID: "f34d65dc-52e3-4321-8d51-a5657da47566"). InnerVolumeSpecName "kube-api-access-gp2xl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.646645 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-scripts" (OuterVolumeSpecName: "scripts") pod "f34d65dc-52e3-4321-8d51-a5657da47566" (UID: "f34d65dc-52e3-4321-8d51-a5657da47566"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.666238 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-config-data" (OuterVolumeSpecName: "config-data") pod "f34d65dc-52e3-4321-8d51-a5657da47566" (UID: "f34d65dc-52e3-4321-8d51-a5657da47566"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.678584 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f34d65dc-52e3-4321-8d51-a5657da47566" (UID: "f34d65dc-52e3-4321-8d51-a5657da47566"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.738009 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.738058 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp2xl\" (UniqueName: \"kubernetes.io/projected/f34d65dc-52e3-4321-8d51-a5657da47566-kube-api-access-gp2xl\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.738077 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:25 crc kubenswrapper[4841]: I0130 06:43:25.738095 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34d65dc-52e3-4321-8d51-a5657da47566-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.101858 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-84ztj" event={"ID":"f34d65dc-52e3-4321-8d51-a5657da47566","Type":"ContainerDied","Data":"3670114a264b73eb1cc2f7987fbb8591fd465359feec8793cb7206ba456e7fa2"}
Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.101918 4841 pod_container_deletor.go:80] "Container not found in pod's
containers" containerID="3670114a264b73eb1cc2f7987fbb8591fd465359feec8793cb7206ba456e7fa2" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.101942 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-84ztj" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.106429 4841 generic.go:334] "Generic (PLEG): container finished" podID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerID="31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db" exitCode=0 Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.106474 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2b77" event={"ID":"c2d7969d-058f-40ff-a4af-a3c8a60a7973","Type":"ContainerDied","Data":"31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db"} Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.189613 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55cd7599f4-9hklc"] Jan 30 06:43:26 crc kubenswrapper[4841]: E0130 06:43:26.190351 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34d65dc-52e3-4321-8d51-a5657da47566" containerName="placement-db-sync" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.190378 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34d65dc-52e3-4321-8d51-a5657da47566" containerName="placement-db-sync" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.190635 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34d65dc-52e3-4321-8d51-a5657da47566" containerName="placement-db-sync" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.191776 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.193686 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.193870 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.194141 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ngtg7" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.194335 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.194580 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.203958 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55cd7599f4-9hklc"] Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.350714 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-scripts\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.350802 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-combined-ca-bundle\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.350847 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk7vz\" (UniqueName: \"kubernetes.io/projected/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-kube-api-access-rk7vz\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.350890 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-public-tls-certs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.351081 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-internal-tls-certs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.351158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-config-data\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.351237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-logs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.452999 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-scripts\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.453110 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-combined-ca-bundle\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.453180 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk7vz\" (UniqueName: \"kubernetes.io/projected/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-kube-api-access-rk7vz\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.453244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-public-tls-certs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.453348 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-internal-tls-certs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.453432 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-config-data\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.453498 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-logs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.454146 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-logs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.458147 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-public-tls-certs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.459499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-scripts\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.460083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-config-data\") pod \"placement-55cd7599f4-9hklc\" (UID: 
\"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.460235 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-internal-tls-certs\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.466824 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-combined-ca-bundle\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.483532 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk7vz\" (UniqueName: \"kubernetes.io/projected/6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f-kube-api-access-rk7vz\") pod \"placement-55cd7599f4-9hklc\" (UID: \"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f\") " pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:26 crc kubenswrapper[4841]: I0130 06:43:26.550642 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:27 crc kubenswrapper[4841]: I0130 06:43:27.094270 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55cd7599f4-9hklc"] Jan 30 06:43:27 crc kubenswrapper[4841]: W0130 06:43:27.097963 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4aa5c0_cef6_4fbb_9194_544ada4c5f8f.slice/crio-a8b6e6c596766eddf390ba3db1826d95cc4c9dcd59836e4815461f2d01fa7a1b WatchSource:0}: Error finding container a8b6e6c596766eddf390ba3db1826d95cc4c9dcd59836e4815461f2d01fa7a1b: Status 404 returned error can't find the container with id a8b6e6c596766eddf390ba3db1826d95cc4c9dcd59836e4815461f2d01fa7a1b Jan 30 06:43:27 crc kubenswrapper[4841]: I0130 06:43:27.114891 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2b77" event={"ID":"c2d7969d-058f-40ff-a4af-a3c8a60a7973","Type":"ContainerStarted","Data":"01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713"} Jan 30 06:43:27 crc kubenswrapper[4841]: I0130 06:43:27.116867 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55cd7599f4-9hklc" event={"ID":"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f","Type":"ContainerStarted","Data":"a8b6e6c596766eddf390ba3db1826d95cc4c9dcd59836e4815461f2d01fa7a1b"} Jan 30 06:43:27 crc kubenswrapper[4841]: I0130 06:43:27.141739 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d2b77" podStartSLOduration=2.675763692 podStartE2EDuration="5.141724616s" podCreationTimestamp="2026-01-30 06:43:22 +0000 UTC" firstStartedPulling="2026-01-30 06:43:24.069536722 +0000 UTC m=+5741.063009370" lastFinishedPulling="2026-01-30 06:43:26.535497616 +0000 UTC m=+5743.528970294" observedRunningTime="2026-01-30 06:43:27.138662225 +0000 UTC m=+5744.132134873" watchObservedRunningTime="2026-01-30 
06:43:27.141724616 +0000 UTC m=+5744.135197254" Jan 30 06:43:28 crc kubenswrapper[4841]: I0130 06:43:28.132145 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55cd7599f4-9hklc" event={"ID":"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f","Type":"ContainerStarted","Data":"b9f085ab47d4c02043bf1cfe16ac4b02ca06f19e339caf994491833a91f0c089"} Jan 30 06:43:28 crc kubenswrapper[4841]: I0130 06:43:28.132733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55cd7599f4-9hklc" event={"ID":"6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f","Type":"ContainerStarted","Data":"a7dd4821573f1478fa90ffc31fd80b300ce678ae328894a6e645421b4b1169fc"} Jan 30 06:43:28 crc kubenswrapper[4841]: I0130 06:43:28.164454 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55cd7599f4-9hklc" podStartSLOduration=2.164372163 podStartE2EDuration="2.164372163s" podCreationTimestamp="2026-01-30 06:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:43:28.155699512 +0000 UTC m=+5745.149172150" watchObservedRunningTime="2026-01-30 06:43:28.164372163 +0000 UTC m=+5745.157844851" Jan 30 06:43:29 crc kubenswrapper[4841]: I0130 06:43:29.143594 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:29 crc kubenswrapper[4841]: I0130 06:43:29.143676 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:31 crc kubenswrapper[4841]: I0130 06:43:31.809633 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" Jan 30 06:43:31 crc kubenswrapper[4841]: I0130 06:43:31.899326 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6594c57-jm9l7"] Jan 30 06:43:31 crc kubenswrapper[4841]: I0130 06:43:31.899726 
4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerName="dnsmasq-dns" containerID="cri-o://5f14bcfe4d5d1da7333f11614795fc8eeb991f60df718c2bcafabb4df680d67c" gracePeriod=10 Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.185301 4841 generic.go:334] "Generic (PLEG): container finished" podID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerID="5f14bcfe4d5d1da7333f11614795fc8eeb991f60df718c2bcafabb4df680d67c" exitCode=0 Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.186178 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" event={"ID":"817caeb4-49b8-4e82-a870-e9bb3792ff16","Type":"ContainerDied","Data":"5f14bcfe4d5d1da7333f11614795fc8eeb991f60df718c2bcafabb4df680d67c"} Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.400379 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.490848 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-config\") pod \"817caeb4-49b8-4e82-a870-e9bb3792ff16\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.491238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-dns-svc\") pod \"817caeb4-49b8-4e82-a870-e9bb3792ff16\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.491287 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b79v\" (UniqueName: 
\"kubernetes.io/projected/817caeb4-49b8-4e82-a870-e9bb3792ff16-kube-api-access-2b79v\") pod \"817caeb4-49b8-4e82-a870-e9bb3792ff16\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.491325 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-nb\") pod \"817caeb4-49b8-4e82-a870-e9bb3792ff16\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.491358 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-sb\") pod \"817caeb4-49b8-4e82-a870-e9bb3792ff16\" (UID: \"817caeb4-49b8-4e82-a870-e9bb3792ff16\") " Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.499046 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/817caeb4-49b8-4e82-a870-e9bb3792ff16-kube-api-access-2b79v" (OuterVolumeSpecName: "kube-api-access-2b79v") pod "817caeb4-49b8-4e82-a870-e9bb3792ff16" (UID: "817caeb4-49b8-4e82-a870-e9bb3792ff16"). InnerVolumeSpecName "kube-api-access-2b79v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.531742 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "817caeb4-49b8-4e82-a870-e9bb3792ff16" (UID: "817caeb4-49b8-4e82-a870-e9bb3792ff16"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.607149 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d2b77" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.607597 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.607624 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b79v\" (UniqueName: \"kubernetes.io/projected/817caeb4-49b8-4e82-a870-e9bb3792ff16-kube-api-access-2b79v\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.608512 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d2b77" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.609092 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-config" (OuterVolumeSpecName: "config") pod "817caeb4-49b8-4e82-a870-e9bb3792ff16" (UID: "817caeb4-49b8-4e82-a870-e9bb3792ff16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.609764 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "817caeb4-49b8-4e82-a870-e9bb3792ff16" (UID: "817caeb4-49b8-4e82-a870-e9bb3792ff16"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.610010 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "817caeb4-49b8-4e82-a870-e9bb3792ff16" (UID: "817caeb4-49b8-4e82-a870-e9bb3792ff16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.657549 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d2b77" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.709535 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.709570 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:32 crc kubenswrapper[4841]: I0130 06:43:32.709585 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/817caeb4-49b8-4e82-a870-e9bb3792ff16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:33 crc kubenswrapper[4841]: I0130 06:43:33.209880 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" event={"ID":"817caeb4-49b8-4e82-a870-e9bb3792ff16","Type":"ContainerDied","Data":"8e42a2589124d808a547427b4aced264d45d8d0c8091f1da5580fcce37549b29"} Jan 30 06:43:33 crc kubenswrapper[4841]: I0130 06:43:33.209980 4841 scope.go:117] "RemoveContainer" containerID="5f14bcfe4d5d1da7333f11614795fc8eeb991f60df718c2bcafabb4df680d67c" Jan 30 06:43:33 crc 
kubenswrapper[4841]: I0130 06:43:33.210510 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" Jan 30 06:43:33 crc kubenswrapper[4841]: I0130 06:43:33.253849 4841 scope.go:117] "RemoveContainer" containerID="00ef51876d22b3b5f234ad11e518c5acb25e88862e920270beab80256f38ced4" Jan 30 06:43:33 crc kubenswrapper[4841]: I0130 06:43:33.277054 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6594c57-jm9l7"] Jan 30 06:43:33 crc kubenswrapper[4841]: I0130 06:43:33.283599 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b6594c57-jm9l7"] Jan 30 06:43:33 crc kubenswrapper[4841]: I0130 06:43:33.301866 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d2b77" Jan 30 06:43:33 crc kubenswrapper[4841]: I0130 06:43:33.353673 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2b77"] Jan 30 06:43:34 crc kubenswrapper[4841]: I0130 06:43:34.452007 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" path="/var/lib/kubelet/pods/817caeb4-49b8-4e82-a870-e9bb3792ff16/volumes" Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.236187 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d2b77" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="registry-server" containerID="cri-o://01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713" gracePeriod=2 Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.788829 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2b77" Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.880085 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-catalog-content\") pod \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.880156 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjxrv\" (UniqueName: \"kubernetes.io/projected/c2d7969d-058f-40ff-a4af-a3c8a60a7973-kube-api-access-tjxrv\") pod \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.880332 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-utilities\") pod \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\" (UID: \"c2d7969d-058f-40ff-a4af-a3c8a60a7973\") " Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.882215 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-utilities" (OuterVolumeSpecName: "utilities") pod "c2d7969d-058f-40ff-a4af-a3c8a60a7973" (UID: "c2d7969d-058f-40ff-a4af-a3c8a60a7973"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.886182 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d7969d-058f-40ff-a4af-a3c8a60a7973-kube-api-access-tjxrv" (OuterVolumeSpecName: "kube-api-access-tjxrv") pod "c2d7969d-058f-40ff-a4af-a3c8a60a7973" (UID: "c2d7969d-058f-40ff-a4af-a3c8a60a7973"). InnerVolumeSpecName "kube-api-access-tjxrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.929087 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2d7969d-058f-40ff-a4af-a3c8a60a7973" (UID: "c2d7969d-058f-40ff-a4af-a3c8a60a7973"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.982823 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.982884 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjxrv\" (UniqueName: \"kubernetes.io/projected/c2d7969d-058f-40ff-a4af-a3c8a60a7973-kube-api-access-tjxrv\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:35 crc kubenswrapper[4841]: I0130 06:43:35.982902 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d7969d-058f-40ff-a4af-a3c8a60a7973-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.251735 4841 generic.go:334] "Generic (PLEG): container finished" podID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerID="01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713" exitCode=0 Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.251817 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2b77" event={"ID":"c2d7969d-058f-40ff-a4af-a3c8a60a7973","Type":"ContainerDied","Data":"01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713"} Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.251885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-d2b77" event={"ID":"c2d7969d-058f-40ff-a4af-a3c8a60a7973","Type":"ContainerDied","Data":"4768fe9b9a0250303fcf2ba578a921103ebc295d266b833483f65853432c1122"} Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.251910 4841 scope.go:117] "RemoveContainer" containerID="01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.251843 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2b77" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.295077 4841 scope.go:117] "RemoveContainer" containerID="31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.318187 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2b77"] Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.333856 4841 scope.go:117] "RemoveContainer" containerID="4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.333935 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2b77"] Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.415044 4841 scope.go:117] "RemoveContainer" containerID="01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713" Jan 30 06:43:36 crc kubenswrapper[4841]: E0130 06:43:36.415812 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713\": container with ID starting with 01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713 not found: ID does not exist" containerID="01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.415866 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713"} err="failed to get container status \"01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713\": rpc error: code = NotFound desc = could not find container \"01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713\": container with ID starting with 01f36520cdee1ca0cfc5387caf234096ccc1d71e577ed2c3c5f83dce6cbf2713 not found: ID does not exist" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.415899 4841 scope.go:117] "RemoveContainer" containerID="31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db" Jan 30 06:43:36 crc kubenswrapper[4841]: E0130 06:43:36.416439 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db\": container with ID starting with 31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db not found: ID does not exist" containerID="31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.416471 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db"} err="failed to get container status \"31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db\": rpc error: code = NotFound desc = could not find container \"31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db\": container with ID starting with 31b9766fbdd658619437a586a6fbeb54d02d6b9145fae5a96ba1460e7305a5db not found: ID does not exist" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.416495 4841 scope.go:117] "RemoveContainer" containerID="4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d" Jan 30 06:43:36 crc kubenswrapper[4841]: E0130 
06:43:36.416851 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d\": container with ID starting with 4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d not found: ID does not exist" containerID="4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.416904 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d"} err="failed to get container status \"4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d\": rpc error: code = NotFound desc = could not find container \"4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d\": container with ID starting with 4dc2128bf1b45eea7b3440aa4e59ad1cc59e4e288d3ee88fd3cf41bf6b606e7d not found: ID does not exist" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.431937 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:43:36 crc kubenswrapper[4841]: E0130 06:43:36.432819 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:43:36 crc kubenswrapper[4841]: I0130 06:43:36.449431 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" path="/var/lib/kubelet/pods/c2d7969d-058f-40ff-a4af-a3c8a60a7973/volumes" Jan 30 06:43:37 crc kubenswrapper[4841]: I0130 06:43:37.366337 
4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b6594c57-jm9l7" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.62:5353: i/o timeout" Jan 30 06:43:51 crc kubenswrapper[4841]: I0130 06:43:51.433507 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:43:51 crc kubenswrapper[4841]: E0130 06:43:51.434686 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:43:57 crc kubenswrapper[4841]: I0130 06:43:57.530116 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:43:57 crc kubenswrapper[4841]: I0130 06:43:57.531514 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55cd7599f4-9hklc" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.432446 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:44:02 crc kubenswrapper[4841]: E0130 06:44:02.433204 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 
06:44:02.928475 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fnvqd"] Jan 30 06:44:02 crc kubenswrapper[4841]: E0130 06:44:02.928905 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerName="init" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.928925 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerName="init" Jan 30 06:44:02 crc kubenswrapper[4841]: E0130 06:44:02.928943 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="extract-utilities" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.928953 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="extract-utilities" Jan 30 06:44:02 crc kubenswrapper[4841]: E0130 06:44:02.928967 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="extract-content" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.928975 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="extract-content" Jan 30 06:44:02 crc kubenswrapper[4841]: E0130 06:44:02.929005 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerName="dnsmasq-dns" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.929013 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerName="dnsmasq-dns" Jan 30 06:44:02 crc kubenswrapper[4841]: E0130 06:44:02.929027 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="registry-server" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.929035 4841 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="registry-server" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.929231 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="817caeb4-49b8-4e82-a870-e9bb3792ff16" containerName="dnsmasq-dns" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.929250 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d7969d-058f-40ff-a4af-a3c8a60a7973" containerName="registry-server" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.930754 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:02 crc kubenswrapper[4841]: I0130 06:44:02.945083 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnvqd"] Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.094231 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-utilities\") pod \"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.094672 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzx9\" (UniqueName: \"kubernetes.io/projected/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-kube-api-access-kxzx9\") pod \"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.095475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-catalog-content\") pod 
\"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.198220 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxzx9\" (UniqueName: \"kubernetes.io/projected/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-kube-api-access-kxzx9\") pod \"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.198275 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-catalog-content\") pod \"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.198372 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-utilities\") pod \"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.199135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-utilities\") pod \"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.199362 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-catalog-content\") pod \"certified-operators-fnvqd\" (UID: 
\"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.238930 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzx9\" (UniqueName: \"kubernetes.io/projected/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-kube-api-access-kxzx9\") pod \"certified-operators-fnvqd\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.257873 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:03 crc kubenswrapper[4841]: I0130 06:44:03.819117 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnvqd"] Jan 30 06:44:03 crc kubenswrapper[4841]: W0130 06:44:03.826845 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb58c90fb_430a_4e5e_9c44_e69cec0afb2b.slice/crio-4830bff33b4da09dbaa3182ca2cf6b8d6003286b9eab958a25f10d67ad8d18e3 WatchSource:0}: Error finding container 4830bff33b4da09dbaa3182ca2cf6b8d6003286b9eab958a25f10d67ad8d18e3: Status 404 returned error can't find the container with id 4830bff33b4da09dbaa3182ca2cf6b8d6003286b9eab958a25f10d67ad8d18e3 Jan 30 06:44:04 crc kubenswrapper[4841]: I0130 06:44:04.547101 4841 generic.go:334] "Generic (PLEG): container finished" podID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerID="94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8" exitCode=0 Jan 30 06:44:04 crc kubenswrapper[4841]: I0130 06:44:04.547181 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnvqd" event={"ID":"b58c90fb-430a-4e5e-9c44-e69cec0afb2b","Type":"ContainerDied","Data":"94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8"} Jan 30 06:44:04 
crc kubenswrapper[4841]: I0130 06:44:04.547439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnvqd" event={"ID":"b58c90fb-430a-4e5e-9c44-e69cec0afb2b","Type":"ContainerStarted","Data":"4830bff33b4da09dbaa3182ca2cf6b8d6003286b9eab958a25f10d67ad8d18e3"} Jan 30 06:44:06 crc kubenswrapper[4841]: I0130 06:44:06.575473 4841 generic.go:334] "Generic (PLEG): container finished" podID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerID="464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb" exitCode=0 Jan 30 06:44:06 crc kubenswrapper[4841]: I0130 06:44:06.575693 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnvqd" event={"ID":"b58c90fb-430a-4e5e-9c44-e69cec0afb2b","Type":"ContainerDied","Data":"464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb"} Jan 30 06:44:07 crc kubenswrapper[4841]: I0130 06:44:07.588737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnvqd" event={"ID":"b58c90fb-430a-4e5e-9c44-e69cec0afb2b","Type":"ContainerStarted","Data":"734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1"} Jan 30 06:44:07 crc kubenswrapper[4841]: I0130 06:44:07.618142 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fnvqd" podStartSLOduration=3.098051256 podStartE2EDuration="5.618122734s" podCreationTimestamp="2026-01-30 06:44:02 +0000 UTC" firstStartedPulling="2026-01-30 06:44:04.549214587 +0000 UTC m=+5781.542687255" lastFinishedPulling="2026-01-30 06:44:07.069286065 +0000 UTC m=+5784.062758733" observedRunningTime="2026-01-30 06:44:07.61306787 +0000 UTC m=+5784.606540518" watchObservedRunningTime="2026-01-30 06:44:07.618122734 +0000 UTC m=+5784.611595382" Jan 30 06:44:13 crc kubenswrapper[4841]: I0130 06:44:13.258116 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:13 crc kubenswrapper[4841]: I0130 06:44:13.258905 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:13 crc kubenswrapper[4841]: I0130 06:44:13.336732 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:13 crc kubenswrapper[4841]: I0130 06:44:13.802319 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:13 crc kubenswrapper[4841]: I0130 06:44:13.868087 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnvqd"] Jan 30 06:44:14 crc kubenswrapper[4841]: I0130 06:44:14.451031 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:44:14 crc kubenswrapper[4841]: E0130 06:44:14.451784 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:44:15 crc kubenswrapper[4841]: I0130 06:44:15.749990 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fnvqd" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="registry-server" containerID="cri-o://734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1" gracePeriod=2 Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.282712 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.381222 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-utilities\") pod \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.381272 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxzx9\" (UniqueName: \"kubernetes.io/projected/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-kube-api-access-kxzx9\") pod \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.381302 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-catalog-content\") pod \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\" (UID: \"b58c90fb-430a-4e5e-9c44-e69cec0afb2b\") " Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.382477 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-utilities" (OuterVolumeSpecName: "utilities") pod "b58c90fb-430a-4e5e-9c44-e69cec0afb2b" (UID: "b58c90fb-430a-4e5e-9c44-e69cec0afb2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.390226 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-kube-api-access-kxzx9" (OuterVolumeSpecName: "kube-api-access-kxzx9") pod "b58c90fb-430a-4e5e-9c44-e69cec0afb2b" (UID: "b58c90fb-430a-4e5e-9c44-e69cec0afb2b"). InnerVolumeSpecName "kube-api-access-kxzx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.483890 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.483920 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxzx9\" (UniqueName: \"kubernetes.io/projected/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-kube-api-access-kxzx9\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.749888 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b58c90fb-430a-4e5e-9c44-e69cec0afb2b" (UID: "b58c90fb-430a-4e5e-9c44-e69cec0afb2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.769533 4841 generic.go:334] "Generic (PLEG): container finished" podID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerID="734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1" exitCode=0 Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.769600 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnvqd" event={"ID":"b58c90fb-430a-4e5e-9c44-e69cec0afb2b","Type":"ContainerDied","Data":"734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1"} Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.769649 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnvqd" event={"ID":"b58c90fb-430a-4e5e-9c44-e69cec0afb2b","Type":"ContainerDied","Data":"4830bff33b4da09dbaa3182ca2cf6b8d6003286b9eab958a25f10d67ad8d18e3"} Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 
06:44:16.769659 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnvqd" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.769677 4841 scope.go:117] "RemoveContainer" containerID="734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.791893 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b58c90fb-430a-4e5e-9c44-e69cec0afb2b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.805825 4841 scope.go:117] "RemoveContainer" containerID="464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.826337 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnvqd"] Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.834031 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fnvqd"] Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.857922 4841 scope.go:117] "RemoveContainer" containerID="94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.917178 4841 scope.go:117] "RemoveContainer" containerID="734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1" Jan 30 06:44:16 crc kubenswrapper[4841]: E0130 06:44:16.921184 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1\": container with ID starting with 734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1 not found: ID does not exist" containerID="734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 
06:44:16.921234 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1"} err="failed to get container status \"734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1\": rpc error: code = NotFound desc = could not find container \"734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1\": container with ID starting with 734f87f92e53fe03e3a4691612ea578469fb3812ad1c0315b9732e9d9ddc20f1 not found: ID does not exist" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.921268 4841 scope.go:117] "RemoveContainer" containerID="464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb" Jan 30 06:44:16 crc kubenswrapper[4841]: E0130 06:44:16.921911 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb\": container with ID starting with 464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb not found: ID does not exist" containerID="464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.922013 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb"} err="failed to get container status \"464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb\": rpc error: code = NotFound desc = could not find container \"464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb\": container with ID starting with 464936ec0fd98f0dc65e6f13b3d10488aa0caee6db45cd2c00ccb7e266a5bacb not found: ID does not exist" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.922109 4841 scope.go:117] "RemoveContainer" containerID="94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8" Jan 30 06:44:16 crc 
kubenswrapper[4841]: E0130 06:44:16.922550 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8\": container with ID starting with 94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8 not found: ID does not exist" containerID="94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8" Jan 30 06:44:16 crc kubenswrapper[4841]: I0130 06:44:16.922678 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8"} err="failed to get container status \"94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8\": rpc error: code = NotFound desc = could not find container \"94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8\": container with ID starting with 94dfc50da2ed0280cd84307d8156420e12df22e48b213b34ac434394cf34b4f8 not found: ID does not exist" Jan 30 06:44:18 crc kubenswrapper[4841]: I0130 06:44:18.448320 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" path="/var/lib/kubelet/pods/b58c90fb-430a-4e5e-9c44-e69cec0afb2b/volumes" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.302320 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-85l9n"] Jan 30 06:44:19 crc kubenswrapper[4841]: E0130 06:44:19.302743 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="extract-utilities" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.302765 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="extract-utilities" Jan 30 06:44:19 crc kubenswrapper[4841]: E0130 06:44:19.302788 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="extract-content" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.302797 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="extract-content" Jan 30 06:44:19 crc kubenswrapper[4841]: E0130 06:44:19.302823 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="registry-server" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.302835 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="registry-server" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.303090 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58c90fb-430a-4e5e-9c44-e69cec0afb2b" containerName="registry-server" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.303812 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.313660 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-85l9n"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.420417 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-75xt8"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.421790 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.431296 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-75xt8"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.450331 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msjt7\" (UniqueName: \"kubernetes.io/projected/b8527b4c-dd51-4993-8997-909f5c4fd939-kube-api-access-msjt7\") pod \"nova-api-db-create-85l9n\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.450440 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8527b4c-dd51-4993-8997-909f5c4fd939-operator-scripts\") pod \"nova-api-db-create-85l9n\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.484353 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c2a2-account-create-update-2892c"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.485344 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.487909 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.511452 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c2a2-account-create-update-2892c"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.552138 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msjt7\" (UniqueName: \"kubernetes.io/projected/b8527b4c-dd51-4993-8997-909f5c4fd939-kube-api-access-msjt7\") pod \"nova-api-db-create-85l9n\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.552219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b80476c5-38e5-46e8-ba13-a999779eca8c-operator-scripts\") pod \"nova-cell0-db-create-75xt8\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.552313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f25j\" (UniqueName: \"kubernetes.io/projected/b80476c5-38e5-46e8-ba13-a999779eca8c-kube-api-access-5f25j\") pod \"nova-cell0-db-create-75xt8\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.552377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8527b4c-dd51-4993-8997-909f5c4fd939-operator-scripts\") pod \"nova-api-db-create-85l9n\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " 
pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.553706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8527b4c-dd51-4993-8997-909f5c4fd939-operator-scripts\") pod \"nova-api-db-create-85l9n\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.576079 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msjt7\" (UniqueName: \"kubernetes.io/projected/b8527b4c-dd51-4993-8997-909f5c4fd939-kube-api-access-msjt7\") pod \"nova-api-db-create-85l9n\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.585192 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-pgdhb"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.587135 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.595586 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pgdhb"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.622164 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.657180 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b80476c5-38e5-46e8-ba13-a999779eca8c-operator-scripts\") pod \"nova-cell0-db-create-75xt8\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.657430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cz8q\" (UniqueName: \"kubernetes.io/projected/55f1a988-8b8f-4799-9efa-b4ec26393ad2-kube-api-access-5cz8q\") pod \"nova-api-c2a2-account-create-update-2892c\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") " pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.657608 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f25j\" (UniqueName: \"kubernetes.io/projected/b80476c5-38e5-46e8-ba13-a999779eca8c-kube-api-access-5f25j\") pod \"nova-cell0-db-create-75xt8\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.657720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f1a988-8b8f-4799-9efa-b4ec26393ad2-operator-scripts\") pod \"nova-api-c2a2-account-create-update-2892c\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") " pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.657771 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b80476c5-38e5-46e8-ba13-a999779eca8c-operator-scripts\") pod 
\"nova-cell0-db-create-75xt8\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.715158 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-390c-account-create-update-jjv5j"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.717297 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f25j\" (UniqueName: \"kubernetes.io/projected/b80476c5-38e5-46e8-ba13-a999779eca8c-kube-api-access-5f25j\") pod \"nova-cell0-db-create-75xt8\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.718802 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.721046 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.728846 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-390c-account-create-update-jjv5j"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.756345 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.765661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppn8w\" (UniqueName: \"kubernetes.io/projected/b3e01bb5-4f5d-453c-9909-115f85275590-kube-api-access-ppn8w\") pod \"nova-cell1-db-create-pgdhb\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") " pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.765722 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1de6617-005c-4b5a-b230-b1578d641b2b-operator-scripts\") pod \"nova-cell0-390c-account-create-update-jjv5j\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") " pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.765757 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e01bb5-4f5d-453c-9909-115f85275590-operator-scripts\") pod \"nova-cell1-db-create-pgdhb\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") " pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.765795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cz8q\" (UniqueName: \"kubernetes.io/projected/55f1a988-8b8f-4799-9efa-b4ec26393ad2-kube-api-access-5cz8q\") pod \"nova-api-c2a2-account-create-update-2892c\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") " pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.765838 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sg24\" (UniqueName: 
\"kubernetes.io/projected/a1de6617-005c-4b5a-b230-b1578d641b2b-kube-api-access-5sg24\") pod \"nova-cell0-390c-account-create-update-jjv5j\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") " pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.765881 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f1a988-8b8f-4799-9efa-b4ec26393ad2-operator-scripts\") pod \"nova-api-c2a2-account-create-update-2892c\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") " pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.766699 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f1a988-8b8f-4799-9efa-b4ec26393ad2-operator-scripts\") pod \"nova-api-c2a2-account-create-update-2892c\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") " pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.796354 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cz8q\" (UniqueName: \"kubernetes.io/projected/55f1a988-8b8f-4799-9efa-b4ec26393ad2-kube-api-access-5cz8q\") pod \"nova-api-c2a2-account-create-update-2892c\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") " pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.808341 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c2a2-account-create-update-2892c" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.867091 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppn8w\" (UniqueName: \"kubernetes.io/projected/b3e01bb5-4f5d-453c-9909-115f85275590-kube-api-access-ppn8w\") pod \"nova-cell1-db-create-pgdhb\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") " pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.867143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1de6617-005c-4b5a-b230-b1578d641b2b-operator-scripts\") pod \"nova-cell0-390c-account-create-update-jjv5j\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") " pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.867173 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e01bb5-4f5d-453c-9909-115f85275590-operator-scripts\") pod \"nova-cell1-db-create-pgdhb\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") " pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.867216 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sg24\" (UniqueName: \"kubernetes.io/projected/a1de6617-005c-4b5a-b230-b1578d641b2b-kube-api-access-5sg24\") pod \"nova-cell0-390c-account-create-update-jjv5j\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") " pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.868316 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1de6617-005c-4b5a-b230-b1578d641b2b-operator-scripts\") pod 
\"nova-cell0-390c-account-create-update-jjv5j\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") " pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.884527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e01bb5-4f5d-453c-9909-115f85275590-operator-scripts\") pod \"nova-cell1-db-create-pgdhb\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") " pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.903102 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sg24\" (UniqueName: \"kubernetes.io/projected/a1de6617-005c-4b5a-b230-b1578d641b2b-kube-api-access-5sg24\") pod \"nova-cell0-390c-account-create-update-jjv5j\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") " pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.905326 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppn8w\" (UniqueName: \"kubernetes.io/projected/b3e01bb5-4f5d-453c-9909-115f85275590-kube-api-access-ppn8w\") pod \"nova-cell1-db-create-pgdhb\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") " pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.914244 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1599-account-create-update-t5v9q"] Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.915247 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.925127 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.926002 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pgdhb" Jan 30 06:44:19 crc kubenswrapper[4841]: I0130 06:44:19.938389 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1599-account-create-update-t5v9q"] Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.070663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v2tc\" (UniqueName: \"kubernetes.io/projected/83eb4b2d-ef33-4018-b164-277863dc9bd6-kube-api-access-8v2tc\") pod \"nova-cell1-1599-account-create-update-t5v9q\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") " pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.070979 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83eb4b2d-ef33-4018-b164-277863dc9bd6-operator-scripts\") pod \"nova-cell1-1599-account-create-update-t5v9q\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") " pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.109536 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-85l9n"] Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.134729 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-390c-account-create-update-jjv5j" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.172751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v2tc\" (UniqueName: \"kubernetes.io/projected/83eb4b2d-ef33-4018-b164-277863dc9bd6-kube-api-access-8v2tc\") pod \"nova-cell1-1599-account-create-update-t5v9q\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") " pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.172867 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83eb4b2d-ef33-4018-b164-277863dc9bd6-operator-scripts\") pod \"nova-cell1-1599-account-create-update-t5v9q\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") " pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.173711 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83eb4b2d-ef33-4018-b164-277863dc9bd6-operator-scripts\") pod \"nova-cell1-1599-account-create-update-t5v9q\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") " pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.191610 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v2tc\" (UniqueName: \"kubernetes.io/projected/83eb4b2d-ef33-4018-b164-277863dc9bd6-kube-api-access-8v2tc\") pod \"nova-cell1-1599-account-create-update-t5v9q\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") " pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.242029 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1599-account-create-update-t5v9q" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.280558 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-75xt8"] Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.368339 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c2a2-account-create-update-2892c"] Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.448280 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pgdhb"] Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.571118 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-390c-account-create-update-jjv5j"] Jan 30 06:44:20 crc kubenswrapper[4841]: W0130 06:44:20.578682 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1de6617_005c_4b5a_b230_b1578d641b2b.slice/crio-0225855f7fe4c008e8da5b893a200365a4b10fd902cea18da4e5ab636d0b00ef WatchSource:0}: Error finding container 0225855f7fe4c008e8da5b893a200365a4b10fd902cea18da4e5ab636d0b00ef: Status 404 returned error can't find the container with id 0225855f7fe4c008e8da5b893a200365a4b10fd902cea18da4e5ab636d0b00ef Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.690224 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1599-account-create-update-t5v9q"] Jan 30 06:44:20 crc kubenswrapper[4841]: W0130 06:44:20.746317 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83eb4b2d_ef33_4018_b164_277863dc9bd6.slice/crio-66a931e83ca0868ba58fcec181e67e4b6ff23a862ba1403adaacf40152fc8600 WatchSource:0}: Error finding container 66a931e83ca0868ba58fcec181e67e4b6ff23a862ba1403adaacf40152fc8600: Status 404 returned error can't find the container with id 
66a931e83ca0868ba58fcec181e67e4b6ff23a862ba1403adaacf40152fc8600 Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.817607 4841 generic.go:334] "Generic (PLEG): container finished" podID="b80476c5-38e5-46e8-ba13-a999779eca8c" containerID="ff7ee4bfc96e7a9b617a23c96e78aac89195c1ab971439d27f61a84ceccb7268" exitCode=0 Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.817670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-75xt8" event={"ID":"b80476c5-38e5-46e8-ba13-a999779eca8c","Type":"ContainerDied","Data":"ff7ee4bfc96e7a9b617a23c96e78aac89195c1ab971439d27f61a84ceccb7268"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.817696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-75xt8" event={"ID":"b80476c5-38e5-46e8-ba13-a999779eca8c","Type":"ContainerStarted","Data":"091efe65df3abe65a05e5367f19f236487dd1823cb0b10568b28c0811de04dd6"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.819611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1599-account-create-update-t5v9q" event={"ID":"83eb4b2d-ef33-4018-b164-277863dc9bd6","Type":"ContainerStarted","Data":"66a931e83ca0868ba58fcec181e67e4b6ff23a862ba1403adaacf40152fc8600"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.821730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pgdhb" event={"ID":"b3e01bb5-4f5d-453c-9909-115f85275590","Type":"ContainerStarted","Data":"51cb556626cb01050492c4a16cbf0f284dd302575ccee459cafea347ce884939"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.822633 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pgdhb" event={"ID":"b3e01bb5-4f5d-453c-9909-115f85275590","Type":"ContainerStarted","Data":"fd6591aa6d1014b762d2186c36db695ab643e5ee5604023949965dcbad800cd4"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.823307 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="b8527b4c-dd51-4993-8997-909f5c4fd939" containerID="62a204ea064b2252916ea57f5f0f20afbca6ab750f74b88c97979f918df1721a" exitCode=0 Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.823483 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-85l9n" event={"ID":"b8527b4c-dd51-4993-8997-909f5c4fd939","Type":"ContainerDied","Data":"62a204ea064b2252916ea57f5f0f20afbca6ab750f74b88c97979f918df1721a"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.823579 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-85l9n" event={"ID":"b8527b4c-dd51-4993-8997-909f5c4fd939","Type":"ContainerStarted","Data":"a3300ee14bfbfe915e08bcdb27e569807107b9b89324816b4fdc4b28a40f5c8f"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.825154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-390c-account-create-update-jjv5j" event={"ID":"a1de6617-005c-4b5a-b230-b1578d641b2b","Type":"ContainerStarted","Data":"0225855f7fe4c008e8da5b893a200365a4b10fd902cea18da4e5ab636d0b00ef"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.831896 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c2a2-account-create-update-2892c" event={"ID":"55f1a988-8b8f-4799-9efa-b4ec26393ad2","Type":"ContainerStarted","Data":"e614bb3403e088c40feda02435441d13c0e4673e306b16c045e5bcf222e3e102"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.831937 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c2a2-account-create-update-2892c" event={"ID":"55f1a988-8b8f-4799-9efa-b4ec26393ad2","Type":"ContainerStarted","Data":"53ceb02c402efa18af6bb385d95e3ac3041acfe3df2cf3b71dcc978df7d4a157"} Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.859135 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-pgdhb" podStartSLOduration=1.859120903 
podStartE2EDuration="1.859120903s" podCreationTimestamp="2026-01-30 06:44:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:20.852324591 +0000 UTC m=+5797.845797229" watchObservedRunningTime="2026-01-30 06:44:20.859120903 +0000 UTC m=+5797.852593541" Jan 30 06:44:20 crc kubenswrapper[4841]: I0130 06:44:20.871076 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-c2a2-account-create-update-2892c" podStartSLOduration=1.871061892 podStartE2EDuration="1.871061892s" podCreationTimestamp="2026-01-30 06:44:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:20.865153874 +0000 UTC m=+5797.858626512" watchObservedRunningTime="2026-01-30 06:44:20.871061892 +0000 UTC m=+5797.864534530" Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.841425 4841 generic.go:334] "Generic (PLEG): container finished" podID="83eb4b2d-ef33-4018-b164-277863dc9bd6" containerID="64e4d66c014416becbe6b28e4b5228500fe624eaa4ba0aafbb94b47a30e0dbc6" exitCode=0 Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.841485 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1599-account-create-update-t5v9q" event={"ID":"83eb4b2d-ef33-4018-b164-277863dc9bd6","Type":"ContainerDied","Data":"64e4d66c014416becbe6b28e4b5228500fe624eaa4ba0aafbb94b47a30e0dbc6"} Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.844283 4841 generic.go:334] "Generic (PLEG): container finished" podID="b3e01bb5-4f5d-453c-9909-115f85275590" containerID="51cb556626cb01050492c4a16cbf0f284dd302575ccee459cafea347ce884939" exitCode=0 Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.844374 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pgdhb" 
event={"ID":"b3e01bb5-4f5d-453c-9909-115f85275590","Type":"ContainerDied","Data":"51cb556626cb01050492c4a16cbf0f284dd302575ccee459cafea347ce884939"} Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.846585 4841 generic.go:334] "Generic (PLEG): container finished" podID="a1de6617-005c-4b5a-b230-b1578d641b2b" containerID="66b4d4b8ef2b5b7d96321d40fc7e26088f66b7a180d0ab03cd01e57807a5ea1a" exitCode=0 Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.846614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-390c-account-create-update-jjv5j" event={"ID":"a1de6617-005c-4b5a-b230-b1578d641b2b","Type":"ContainerDied","Data":"66b4d4b8ef2b5b7d96321d40fc7e26088f66b7a180d0ab03cd01e57807a5ea1a"} Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.848366 4841 generic.go:334] "Generic (PLEG): container finished" podID="55f1a988-8b8f-4799-9efa-b4ec26393ad2" containerID="e614bb3403e088c40feda02435441d13c0e4673e306b16c045e5bcf222e3e102" exitCode=0 Jan 30 06:44:21 crc kubenswrapper[4841]: I0130 06:44:21.848450 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c2a2-account-create-update-2892c" event={"ID":"55f1a988-8b8f-4799-9efa-b4ec26393ad2","Type":"ContainerDied","Data":"e614bb3403e088c40feda02435441d13c0e4673e306b16c045e5bcf222e3e102"} Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.246649 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-85l9n" Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.252296 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-75xt8" Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.317402 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8527b4c-dd51-4993-8997-909f5c4fd939-operator-scripts\") pod \"b8527b4c-dd51-4993-8997-909f5c4fd939\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.319158 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8527b4c-dd51-4993-8997-909f5c4fd939-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8527b4c-dd51-4993-8997-909f5c4fd939" (UID: "b8527b4c-dd51-4993-8997-909f5c4fd939"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.329322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f25j\" (UniqueName: \"kubernetes.io/projected/b80476c5-38e5-46e8-ba13-a999779eca8c-kube-api-access-5f25j\") pod \"b80476c5-38e5-46e8-ba13-a999779eca8c\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.331929 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msjt7\" (UniqueName: \"kubernetes.io/projected/b8527b4c-dd51-4993-8997-909f5c4fd939-kube-api-access-msjt7\") pod \"b8527b4c-dd51-4993-8997-909f5c4fd939\" (UID: \"b8527b4c-dd51-4993-8997-909f5c4fd939\") " Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.331994 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b80476c5-38e5-46e8-ba13-a999779eca8c-operator-scripts\") pod \"b80476c5-38e5-46e8-ba13-a999779eca8c\" (UID: \"b80476c5-38e5-46e8-ba13-a999779eca8c\") " Jan 30 06:44:22 crc 
kubenswrapper[4841]: I0130 06:44:22.334161 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8527b4c-dd51-4993-8997-909f5c4fd939-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.334857 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b80476c5-38e5-46e8-ba13-a999779eca8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b80476c5-38e5-46e8-ba13-a999779eca8c" (UID: "b80476c5-38e5-46e8-ba13-a999779eca8c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.340851 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8527b4c-dd51-4993-8997-909f5c4fd939-kube-api-access-msjt7" (OuterVolumeSpecName: "kube-api-access-msjt7") pod "b8527b4c-dd51-4993-8997-909f5c4fd939" (UID: "b8527b4c-dd51-4993-8997-909f5c4fd939"). InnerVolumeSpecName "kube-api-access-msjt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.342236 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b80476c5-38e5-46e8-ba13-a999779eca8c-kube-api-access-5f25j" (OuterVolumeSpecName: "kube-api-access-5f25j") pod "b80476c5-38e5-46e8-ba13-a999779eca8c" (UID: "b80476c5-38e5-46e8-ba13-a999779eca8c"). InnerVolumeSpecName "kube-api-access-5f25j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.436394 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msjt7\" (UniqueName: \"kubernetes.io/projected/b8527b4c-dd51-4993-8997-909f5c4fd939-kube-api-access-msjt7\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.436788 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b80476c5-38e5-46e8-ba13-a999779eca8c-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.436804 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f25j\" (UniqueName: \"kubernetes.io/projected/b80476c5-38e5-46e8-ba13-a999779eca8c-kube-api-access-5f25j\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.859150 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-75xt8" event={"ID":"b80476c5-38e5-46e8-ba13-a999779eca8c","Type":"ContainerDied","Data":"091efe65df3abe65a05e5367f19f236487dd1823cb0b10568b28c0811de04dd6"}
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.859196 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="091efe65df3abe65a05e5367f19f236487dd1823cb0b10568b28c0811de04dd6"
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.859280 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-75xt8"
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.861850 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-85l9n"
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.862545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-85l9n" event={"ID":"b8527b4c-dd51-4993-8997-909f5c4fd939","Type":"ContainerDied","Data":"a3300ee14bfbfe915e08bcdb27e569807107b9b89324816b4fdc4b28a40f5c8f"}
Jan 30 06:44:22 crc kubenswrapper[4841]: I0130 06:44:22.862587 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3300ee14bfbfe915e08bcdb27e569807107b9b89324816b4fdc4b28a40f5c8f"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.381917 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pgdhb"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.389334 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1599-account-create-update-t5v9q"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.395790 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c2a2-account-create-update-2892c"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.402085 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-390c-account-create-update-jjv5j"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.452522 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e01bb5-4f5d-453c-9909-115f85275590-operator-scripts\") pod \"b3e01bb5-4f5d-453c-9909-115f85275590\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.452576 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppn8w\" (UniqueName: \"kubernetes.io/projected/b3e01bb5-4f5d-453c-9909-115f85275590-kube-api-access-ppn8w\") pod \"b3e01bb5-4f5d-453c-9909-115f85275590\" (UID: \"b3e01bb5-4f5d-453c-9909-115f85275590\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.452977 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3e01bb5-4f5d-453c-9909-115f85275590-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b3e01bb5-4f5d-453c-9909-115f85275590" (UID: "b3e01bb5-4f5d-453c-9909-115f85275590"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.453993 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f1a988-8b8f-4799-9efa-b4ec26393ad2-operator-scripts\") pod \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.454086 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83eb4b2d-ef33-4018-b164-277863dc9bd6-operator-scripts\") pod \"83eb4b2d-ef33-4018-b164-277863dc9bd6\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.454138 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1de6617-005c-4b5a-b230-b1578d641b2b-operator-scripts\") pod \"a1de6617-005c-4b5a-b230-b1578d641b2b\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.454265 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cz8q\" (UniqueName: \"kubernetes.io/projected/55f1a988-8b8f-4799-9efa-b4ec26393ad2-kube-api-access-5cz8q\") pod \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\" (UID: \"55f1a988-8b8f-4799-9efa-b4ec26393ad2\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.454297 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f1a988-8b8f-4799-9efa-b4ec26393ad2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55f1a988-8b8f-4799-9efa-b4ec26393ad2" (UID: "55f1a988-8b8f-4799-9efa-b4ec26393ad2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.454308 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sg24\" (UniqueName: \"kubernetes.io/projected/a1de6617-005c-4b5a-b230-b1578d641b2b-kube-api-access-5sg24\") pod \"a1de6617-005c-4b5a-b230-b1578d641b2b\" (UID: \"a1de6617-005c-4b5a-b230-b1578d641b2b\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.454402 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v2tc\" (UniqueName: \"kubernetes.io/projected/83eb4b2d-ef33-4018-b164-277863dc9bd6-kube-api-access-8v2tc\") pod \"83eb4b2d-ef33-4018-b164-277863dc9bd6\" (UID: \"83eb4b2d-ef33-4018-b164-277863dc9bd6\") "
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.454761 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1de6617-005c-4b5a-b230-b1578d641b2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1de6617-005c-4b5a-b230-b1578d641b2b" (UID: "a1de6617-005c-4b5a-b230-b1578d641b2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.455255 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1de6617-005c-4b5a-b230-b1578d641b2b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.455286 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3e01bb5-4f5d-453c-9909-115f85275590-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.455299 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f1a988-8b8f-4799-9efa-b4ec26393ad2-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.455794 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83eb4b2d-ef33-4018-b164-277863dc9bd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83eb4b2d-ef33-4018-b164-277863dc9bd6" (UID: "83eb4b2d-ef33-4018-b164-277863dc9bd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.457741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e01bb5-4f5d-453c-9909-115f85275590-kube-api-access-ppn8w" (OuterVolumeSpecName: "kube-api-access-ppn8w") pod "b3e01bb5-4f5d-453c-9909-115f85275590" (UID: "b3e01bb5-4f5d-453c-9909-115f85275590"). InnerVolumeSpecName "kube-api-access-ppn8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.458096 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83eb4b2d-ef33-4018-b164-277863dc9bd6-kube-api-access-8v2tc" (OuterVolumeSpecName: "kube-api-access-8v2tc") pod "83eb4b2d-ef33-4018-b164-277863dc9bd6" (UID: "83eb4b2d-ef33-4018-b164-277863dc9bd6"). InnerVolumeSpecName "kube-api-access-8v2tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.458994 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f1a988-8b8f-4799-9efa-b4ec26393ad2-kube-api-access-5cz8q" (OuterVolumeSpecName: "kube-api-access-5cz8q") pod "55f1a988-8b8f-4799-9efa-b4ec26393ad2" (UID: "55f1a988-8b8f-4799-9efa-b4ec26393ad2"). InnerVolumeSpecName "kube-api-access-5cz8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.459321 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1de6617-005c-4b5a-b230-b1578d641b2b-kube-api-access-5sg24" (OuterVolumeSpecName: "kube-api-access-5sg24") pod "a1de6617-005c-4b5a-b230-b1578d641b2b" (UID: "a1de6617-005c-4b5a-b230-b1578d641b2b"). InnerVolumeSpecName "kube-api-access-5sg24". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.556735 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83eb4b2d-ef33-4018-b164-277863dc9bd6-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.556787 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cz8q\" (UniqueName: \"kubernetes.io/projected/55f1a988-8b8f-4799-9efa-b4ec26393ad2-kube-api-access-5cz8q\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.556798 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sg24\" (UniqueName: \"kubernetes.io/projected/a1de6617-005c-4b5a-b230-b1578d641b2b-kube-api-access-5sg24\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.556808 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v2tc\" (UniqueName: \"kubernetes.io/projected/83eb4b2d-ef33-4018-b164-277863dc9bd6-kube-api-access-8v2tc\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.556819 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppn8w\" (UniqueName: \"kubernetes.io/projected/b3e01bb5-4f5d-453c-9909-115f85275590-kube-api-access-ppn8w\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.872681 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1599-account-create-update-t5v9q" event={"ID":"83eb4b2d-ef33-4018-b164-277863dc9bd6","Type":"ContainerDied","Data":"66a931e83ca0868ba58fcec181e67e4b6ff23a862ba1403adaacf40152fc8600"}
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.872734 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1599-account-create-update-t5v9q"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.872740 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66a931e83ca0868ba58fcec181e67e4b6ff23a862ba1403adaacf40152fc8600"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.875342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pgdhb" event={"ID":"b3e01bb5-4f5d-453c-9909-115f85275590","Type":"ContainerDied","Data":"fd6591aa6d1014b762d2186c36db695ab643e5ee5604023949965dcbad800cd4"}
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.875485 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd6591aa6d1014b762d2186c36db695ab643e5ee5604023949965dcbad800cd4"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.875358 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pgdhb"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.877835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-390c-account-create-update-jjv5j" event={"ID":"a1de6617-005c-4b5a-b230-b1578d641b2b","Type":"ContainerDied","Data":"0225855f7fe4c008e8da5b893a200365a4b10fd902cea18da4e5ab636d0b00ef"}
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.877938 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0225855f7fe4c008e8da5b893a200365a4b10fd902cea18da4e5ab636d0b00ef"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.877878 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-390c-account-create-update-jjv5j"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.879972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c2a2-account-create-update-2892c" event={"ID":"55f1a988-8b8f-4799-9efa-b4ec26393ad2","Type":"ContainerDied","Data":"53ceb02c402efa18af6bb385d95e3ac3041acfe3df2cf3b71dcc978df7d4a157"}
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.880005 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53ceb02c402efa18af6bb385d95e3ac3041acfe3df2cf3b71dcc978df7d4a157"
Jan 30 06:44:23 crc kubenswrapper[4841]: I0130 06:44:23.880045 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c2a2-account-create-update-2892c"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010129 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws5lz"]
Jan 30 06:44:25 crc kubenswrapper[4841]: E0130 06:44:25.010659 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80476c5-38e5-46e8-ba13-a999779eca8c" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010671 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80476c5-38e5-46e8-ba13-a999779eca8c" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: E0130 06:44:25.010688 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e01bb5-4f5d-453c-9909-115f85275590" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010695 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e01bb5-4f5d-453c-9909-115f85275590" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: E0130 06:44:25.010707 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f1a988-8b8f-4799-9efa-b4ec26393ad2" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010713 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f1a988-8b8f-4799-9efa-b4ec26393ad2" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: E0130 06:44:25.010726 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8527b4c-dd51-4993-8997-909f5c4fd939" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010733 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8527b4c-dd51-4993-8997-909f5c4fd939" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: E0130 06:44:25.010750 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1de6617-005c-4b5a-b230-b1578d641b2b" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010757 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1de6617-005c-4b5a-b230-b1578d641b2b" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: E0130 06:44:25.010772 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83eb4b2d-ef33-4018-b164-277863dc9bd6" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010778 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="83eb4b2d-ef33-4018-b164-277863dc9bd6" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010914 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1de6617-005c-4b5a-b230-b1578d641b2b" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010926 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f1a988-8b8f-4799-9efa-b4ec26393ad2" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010932 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b80476c5-38e5-46e8-ba13-a999779eca8c" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010940 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e01bb5-4f5d-453c-9909-115f85275590" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010952 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="83eb4b2d-ef33-4018-b164-277863dc9bd6" containerName="mariadb-account-create-update"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.010963 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8527b4c-dd51-4993-8997-909f5c4fd939" containerName="mariadb-database-create"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.011531 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.014563 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.014609 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.014940 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-x46gm"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.025588 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws5lz"]
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.091293 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.091392 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnbml\" (UniqueName: \"kubernetes.io/projected/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-kube-api-access-mnbml\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.091432 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-scripts\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.091489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-config-data\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.192916 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnbml\" (UniqueName: \"kubernetes.io/projected/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-kube-api-access-mnbml\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.192997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-scripts\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.193067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-config-data\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.193118 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.198751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-scripts\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.199053 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-config-data\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.199258 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.237882 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnbml\" (UniqueName: \"kubernetes.io/projected/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-kube-api-access-mnbml\") pod \"nova-cell0-conductor-db-sync-ws5lz\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") " pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.332980 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.815119 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws5lz"]
Jan 30 06:44:25 crc kubenswrapper[4841]: I0130 06:44:25.929632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws5lz" event={"ID":"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32","Type":"ContainerStarted","Data":"f2145d9220ff1e18906a60b36e8de7f4e52875b16349ca9093f5169e2825280b"}
Jan 30 06:44:26 crc kubenswrapper[4841]: I0130 06:44:26.432157 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"
Jan 30 06:44:26 crc kubenswrapper[4841]: E0130 06:44:26.432868 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:44:26 crc kubenswrapper[4841]: I0130 06:44:26.940270 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws5lz" event={"ID":"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32","Type":"ContainerStarted","Data":"bd65f3076ea24c438dde45460632288a230a6e6a52c3fcf6f672d01f2b2982f3"}
Jan 30 06:44:26 crc kubenswrapper[4841]: I0130 06:44:26.962820 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ws5lz" podStartSLOduration=2.962806007 podStartE2EDuration="2.962806007s" podCreationTimestamp="2026-01-30 06:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:26.956664813 +0000 UTC m=+5803.950137451" watchObservedRunningTime="2026-01-30 06:44:26.962806007 +0000 UTC m=+5803.956278645"
Jan 30 06:44:30 crc kubenswrapper[4841]: I0130 06:44:30.986095 4841 generic.go:334] "Generic (PLEG): container finished" podID="797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" containerID="bd65f3076ea24c438dde45460632288a230a6e6a52c3fcf6f672d01f2b2982f3" exitCode=0
Jan 30 06:44:30 crc kubenswrapper[4841]: I0130 06:44:30.986199 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws5lz" event={"ID":"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32","Type":"ContainerDied","Data":"bd65f3076ea24c438dde45460632288a230a6e6a52c3fcf6f672d01f2b2982f3"}
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.449476 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.483100 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-scripts\") pod \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") "
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.483220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-config-data\") pod \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") "
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.483284 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-combined-ca-bundle\") pod \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") "
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.483496 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnbml\" (UniqueName: \"kubernetes.io/projected/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-kube-api-access-mnbml\") pod \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\" (UID: \"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32\") "
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.503375 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-scripts" (OuterVolumeSpecName: "scripts") pod "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" (UID: "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.510055 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-kube-api-access-mnbml" (OuterVolumeSpecName: "kube-api-access-mnbml") pod "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" (UID: "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32"). InnerVolumeSpecName "kube-api-access-mnbml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.518373 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-config-data" (OuterVolumeSpecName: "config-data") pod "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" (UID: "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.529422 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" (UID: "797cdae3-bb0b-497e-b8a6-cdd4e3c13f32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.586652 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnbml\" (UniqueName: \"kubernetes.io/projected/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-kube-api-access-mnbml\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.586721 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.586735 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:32 crc kubenswrapper[4841]: I0130 06:44:32.586749 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.011524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ws5lz" event={"ID":"797cdae3-bb0b-497e-b8a6-cdd4e3c13f32","Type":"ContainerDied","Data":"f2145d9220ff1e18906a60b36e8de7f4e52875b16349ca9093f5169e2825280b"}
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.011938 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2145d9220ff1e18906a60b36e8de7f4e52875b16349ca9093f5169e2825280b"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.011646 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ws5lz"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.138593 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 06:44:33 crc kubenswrapper[4841]: E0130 06:44:33.139064 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" containerName="nova-cell0-conductor-db-sync"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.139084 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" containerName="nova-cell0-conductor-db-sync"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.139359 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" containerName="nova-cell0-conductor-db-sync"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.140149 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.142997 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.143224 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-x46gm"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.178857 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.198769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e2985a-b102-472e-90af-13a1e4278197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.198960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e2985a-b102-472e-90af-13a1e4278197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.198989 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxmq\" (UniqueName: \"kubernetes.io/projected/67e2985a-b102-472e-90af-13a1e4278197-kube-api-access-mbxmq\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.299867 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e2985a-b102-472e-90af-13a1e4278197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.300065 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e2985a-b102-472e-90af-13a1e4278197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.300102 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxmq\" (UniqueName: \"kubernetes.io/projected/67e2985a-b102-472e-90af-13a1e4278197-kube-api-access-mbxmq\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.305928 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e2985a-b102-472e-90af-13a1e4278197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.305954 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67e2985a-b102-472e-90af-13a1e4278197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.327511 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxmq\" (UniqueName: \"kubernetes.io/projected/67e2985a-b102-472e-90af-13a1e4278197-kube-api-access-mbxmq\") pod \"nova-cell0-conductor-0\" (UID: \"67e2985a-b102-472e-90af-13a1e4278197\") " pod="openstack/nova-cell0-conductor-0"
Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.464834 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 06:44:33 crc kubenswrapper[4841]: I0130 06:44:33.980212 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 06:44:34 crc kubenswrapper[4841]: I0130 06:44:34.033016 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"67e2985a-b102-472e-90af-13a1e4278197","Type":"ContainerStarted","Data":"b91fc0b7374fc8eb18d0dc8094558254829e0f8a0615ff7629ee1ee74f36c816"} Jan 30 06:44:35 crc kubenswrapper[4841]: I0130 06:44:35.045934 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"67e2985a-b102-472e-90af-13a1e4278197","Type":"ContainerStarted","Data":"2ce7d234394a163d05b859560dc169be97359aa8579c21bc493ba8932ff93911"} Jan 30 06:44:35 crc kubenswrapper[4841]: I0130 06:44:35.048090 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 06:44:35 crc kubenswrapper[4841]: I0130 06:44:35.072284 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.072257341 podStartE2EDuration="2.072257341s" podCreationTimestamp="2026-01-30 06:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:35.070796071 +0000 UTC m=+5812.064268729" watchObservedRunningTime="2026-01-30 06:44:35.072257341 +0000 UTC m=+5812.065730009" Jan 30 06:44:38 crc kubenswrapper[4841]: I0130 06:44:38.432839 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:44:38 crc kubenswrapper[4841]: E0130 06:44:38.433541 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:44:43 crc kubenswrapper[4841]: I0130 06:44:43.512344 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.024175 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mccnj"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.025979 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.030321 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.030443 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.035632 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mccnj"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.201204 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-scripts\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.201329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xptlb\" (UniqueName: \"kubernetes.io/projected/273b206a-f216-479f-927c-aed775be10b6-kube-api-access-xptlb\") pod 
\"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.201363 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-config-data\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.201434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.234293 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.236285 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.240243 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.245592 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.277478 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.278835 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.282268 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.304549 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-scripts\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.304647 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xptlb\" (UniqueName: \"kubernetes.io/projected/273b206a-f216-479f-927c-aed775be10b6-kube-api-access-xptlb\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.304674 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-config-data\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.304712 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.313504 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-config-data\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.314801 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-scripts\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.328231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.337470 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.372420 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xptlb\" (UniqueName: \"kubernetes.io/projected/273b206a-f216-479f-927c-aed775be10b6-kube-api-access-xptlb\") pod \"nova-cell0-cell-mapping-mccnj\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.384165 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.385546 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.390754 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.394983 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.395979 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.406738 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.406781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70062e63-a5a7-40f5-add6-7c268838b5e5-logs\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.406815 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26m6w\" (UniqueName: \"kubernetes.io/projected/d3447f0c-f61f-456b-8b1f-4956055d9bcc-kube-api-access-26m6w\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.406849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t668\" (UniqueName: \"kubernetes.io/projected/70062e63-a5a7-40f5-add6-7c268838b5e5-kube-api-access-2t668\") pod 
\"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.406880 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.406904 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-config-data\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.406925 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-config-data\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.492480 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.493569 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.500384 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.557539 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fb5f469f9-lc6zv"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581120 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4z7\" (UniqueName: \"kubernetes.io/projected/5d55efc5-765f-4e49-be25-f7f98a23213e-kube-api-access-dg4z7\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581202 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d55efc5-765f-4e49-be25-f7f98a23213e-logs\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581311 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581408 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70062e63-a5a7-40f5-add6-7c268838b5e5-logs\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-26m6w\" (UniqueName: \"kubernetes.io/projected/d3447f0c-f61f-456b-8b1f-4956055d9bcc-kube-api-access-26m6w\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581572 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t668\" (UniqueName: \"kubernetes.io/projected/70062e63-a5a7-40f5-add6-7c268838b5e5-kube-api-access-2t668\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581597 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581620 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581654 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581676 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-config-data\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581735 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581784 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-config-data\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581945 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvlmk\" (UniqueName: \"kubernetes.io/projected/401924cd-9c3a-4772-bf87-bb82985e2eec-kube-api-access-jvlmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.581998 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-config-data\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.644593 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-config-data\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" 
Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.646370 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.649061 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.649638 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70062e63-a5a7-40f5-add6-7c268838b5e5-logs\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.650121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-config-data\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.656324 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.662313 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26m6w\" (UniqueName: \"kubernetes.io/projected/d3447f0c-f61f-456b-8b1f-4956055d9bcc-kube-api-access-26m6w\") pod \"nova-scheduler-0\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") " pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.681748 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.684072 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t668\" (UniqueName: \"kubernetes.io/projected/70062e63-a5a7-40f5-add6-7c268838b5e5-kube-api-access-2t668\") pod \"nova-api-0\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.684999 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.685041 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.685063 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-config-data\") pod \"nova-metadata-0\" (UID: 
\"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.685270 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.685319 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvlmk\" (UniqueName: \"kubernetes.io/projected/401924cd-9c3a-4772-bf87-bb82985e2eec-kube-api-access-jvlmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.685353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4z7\" (UniqueName: \"kubernetes.io/projected/5d55efc5-765f-4e49-be25-f7f98a23213e-kube-api-access-dg4z7\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.685378 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d55efc5-765f-4e49-be25-f7f98a23213e-logs\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.685803 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d55efc5-765f-4e49-be25-f7f98a23213e-logs\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.693010 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.693822 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-config-data\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.696857 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.700471 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fb5f469f9-lc6zv"] Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.714209 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.719709 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4z7\" (UniqueName: \"kubernetes.io/projected/5d55efc5-765f-4e49-be25-f7f98a23213e-kube-api-access-dg4z7\") pod \"nova-metadata-0\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.720980 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvlmk\" (UniqueName: \"kubernetes.io/projected/401924cd-9c3a-4772-bf87-bb82985e2eec-kube-api-access-jvlmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.788711 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.788753 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-dns-svc\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.788786 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n86c2\" (UniqueName: \"kubernetes.io/projected/02ad2afe-9c7b-437c-a8dd-21902fed0051-kube-api-access-n86c2\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.788830 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-config\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.788885 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-sb\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.863259 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.868620 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.890895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.890947 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-dns-svc\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.890977 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n86c2\" (UniqueName: \"kubernetes.io/projected/02ad2afe-9c7b-437c-a8dd-21902fed0051-kube-api-access-n86c2\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.891026 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-config\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.891073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-sb\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.891778 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-dns-svc\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.891780 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.892331 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-sb\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.892359 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-config\") pod 
\"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.904892 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.915507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n86c2\" (UniqueName: \"kubernetes.io/projected/02ad2afe-9c7b-437c-a8dd-21902fed0051-kube-api-access-n86c2\") pod \"dnsmasq-dns-6fb5f469f9-lc6zv\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.965911 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:44 crc kubenswrapper[4841]: I0130 06:44:44.979918 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.175009 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mccnj"] Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.295988 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwdkc"] Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.297819 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.299816 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.300078 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.313947 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwdkc"] Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.408583 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wwp\" (UniqueName: \"kubernetes.io/projected/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-kube-api-access-26wwp\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.408674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-config-data\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.409629 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.409894 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-scripts\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.478194 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:45 crc kubenswrapper[4841]: W0130 06:44:45.478815 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70062e63_a5a7_40f5_add6_7c268838b5e5.slice/crio-8e6093c023ab5be02dc80a1a48ae83d8fc526a73cf7992554155b68de8d63d5f WatchSource:0}: Error finding container 8e6093c023ab5be02dc80a1a48ae83d8fc526a73cf7992554155b68de8d63d5f: Status 404 returned error can't find the container with id 8e6093c023ab5be02dc80a1a48ae83d8fc526a73cf7992554155b68de8d63d5f Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.496856 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.511860 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wwp\" (UniqueName: \"kubernetes.io/projected/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-kube-api-access-26wwp\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.512473 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-config-data\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: 
I0130 06:44:45.512590 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.512712 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-scripts\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.516968 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-scripts\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.517123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-config-data\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.517479 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.536971 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wwp\" (UniqueName: \"kubernetes.io/projected/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-kube-api-access-26wwp\") pod \"nova-cell1-conductor-db-sync-hwdkc\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.640572 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:44:45 crc kubenswrapper[4841]: W0130 06:44:45.652787 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3447f0c_f61f_456b_8b1f_4956055d9bcc.slice/crio-7f204eb4255dcfaeccca4de9126afb3dbd28df87d90932e98b2aefc31dba3f4f WatchSource:0}: Error finding container 7f204eb4255dcfaeccca4de9126afb3dbd28df87d90932e98b2aefc31dba3f4f: Status 404 returned error can't find the container with id 7f204eb4255dcfaeccca4de9126afb3dbd28df87d90932e98b2aefc31dba3f4f Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.722820 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.762444 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:45 crc kubenswrapper[4841]: I0130 06:44:45.775368 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fb5f469f9-lc6zv"] Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.158640 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"401924cd-9c3a-4772-bf87-bb82985e2eec","Type":"ContainerStarted","Data":"16449dbb228c6ee3ca24c751b0d8270bac6f6fedeb8a08fd8e2a5808dd43443e"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.159081 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"401924cd-9c3a-4772-bf87-bb82985e2eec","Type":"ContainerStarted","Data":"ffd60a457602c67c774d0b54d1da20b280de4b9ce1479527a04893ebbe17d5b5"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.160833 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3447f0c-f61f-456b-8b1f-4956055d9bcc","Type":"ContainerStarted","Data":"f6e4fd5fcc96b19e980a941577cdf4a3eef3cc0b6c47033b2d6798ddfc7cdd8b"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.160874 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3447f0c-f61f-456b-8b1f-4956055d9bcc","Type":"ContainerStarted","Data":"7f204eb4255dcfaeccca4de9126afb3dbd28df87d90932e98b2aefc31dba3f4f"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.162450 4841 generic.go:334] "Generic (PLEG): container finished" podID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerID="f784072bbf33149506c374713bcad7f1e775ba849f776ee2bbf0fd14d862f299" exitCode=0 Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.162528 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" event={"ID":"02ad2afe-9c7b-437c-a8dd-21902fed0051","Type":"ContainerDied","Data":"f784072bbf33149506c374713bcad7f1e775ba849f776ee2bbf0fd14d862f299"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.162558 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" event={"ID":"02ad2afe-9c7b-437c-a8dd-21902fed0051","Type":"ContainerStarted","Data":"bbeef325619650011786125cf83bd356393bac6f5aeb8782ebf7c3826231c099"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.164729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70062e63-a5a7-40f5-add6-7c268838b5e5","Type":"ContainerStarted","Data":"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.164766 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70062e63-a5a7-40f5-add6-7c268838b5e5","Type":"ContainerStarted","Data":"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.164776 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70062e63-a5a7-40f5-add6-7c268838b5e5","Type":"ContainerStarted","Data":"8e6093c023ab5be02dc80a1a48ae83d8fc526a73cf7992554155b68de8d63d5f"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.166250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mccnj" event={"ID":"273b206a-f216-479f-927c-aed775be10b6","Type":"ContainerStarted","Data":"4d625a00dde5495041c61246d870b950a9d82636d2edbbc24c5dc1c4e2c2729b"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.166286 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mccnj" 
event={"ID":"273b206a-f216-479f-927c-aed775be10b6","Type":"ContainerStarted","Data":"d92aba10f6a0f76cf3ddcca4e05d1c9b6585978fea93e26256581c7cf5501dfa"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.168967 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d55efc5-765f-4e49-be25-f7f98a23213e","Type":"ContainerStarted","Data":"4b19921ac258b7a81bd50539cda824dc6e1104ee0edd57de4e4db8a55fb77560"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.168993 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d55efc5-765f-4e49-be25-f7f98a23213e","Type":"ContainerStarted","Data":"e4de32d38c29af149fbc2cb43bae0696cede6db965c16533eab42f831e6483a0"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.169005 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d55efc5-765f-4e49-be25-f7f98a23213e","Type":"ContainerStarted","Data":"699a8ca26c23dfd433fa32988673d168c3261e4fdbab974113818ac3e7a9aa7b"} Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.184754 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.184735685 podStartE2EDuration="2.184735685s" podCreationTimestamp="2026-01-30 06:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:46.177687127 +0000 UTC m=+5823.171159765" watchObservedRunningTime="2026-01-30 06:44:46.184735685 +0000 UTC m=+5823.178208323" Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.232154 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwdkc"] Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.242498 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.242477695 
podStartE2EDuration="2.242477695s" podCreationTimestamp="2026-01-30 06:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:46.219571394 +0000 UTC m=+5823.213044032" watchObservedRunningTime="2026-01-30 06:44:46.242477695 +0000 UTC m=+5823.235950333" Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.256070 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mccnj" podStartSLOduration=3.256049157 podStartE2EDuration="3.256049157s" podCreationTimestamp="2026-01-30 06:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:46.232026357 +0000 UTC m=+5823.225499005" watchObservedRunningTime="2026-01-30 06:44:46.256049157 +0000 UTC m=+5823.249521795" Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.267908 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.267892284 podStartE2EDuration="2.267892284s" podCreationTimestamp="2026-01-30 06:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:46.257780653 +0000 UTC m=+5823.251253291" watchObservedRunningTime="2026-01-30 06:44:46.267892284 +0000 UTC m=+5823.261364922" Jan 30 06:44:46 crc kubenswrapper[4841]: I0130 06:44:46.280122 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.280108139 podStartE2EDuration="2.280108139s" podCreationTimestamp="2026-01-30 06:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:46.274699925 +0000 UTC m=+5823.268172563" watchObservedRunningTime="2026-01-30 
06:44:46.280108139 +0000 UTC m=+5823.273580777" Jan 30 06:44:47 crc kubenswrapper[4841]: I0130 06:44:47.179156 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" event={"ID":"02ad2afe-9c7b-437c-a8dd-21902fed0051","Type":"ContainerStarted","Data":"8238e9d7576ea53b78b4060063ecd593cd36dfb4b2b76876395e3f04eff8d08b"} Jan 30 06:44:47 crc kubenswrapper[4841]: I0130 06:44:47.180381 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:47 crc kubenswrapper[4841]: I0130 06:44:47.182655 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" event={"ID":"995f0195-eb79-4f48-ac66-ac38b0f7cd0f","Type":"ContainerStarted","Data":"64c11b2824241bdfc1a972a0a8cb5f7c371117564c57e3a50363123323495590"} Jan 30 06:44:47 crc kubenswrapper[4841]: I0130 06:44:47.182679 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" event={"ID":"995f0195-eb79-4f48-ac66-ac38b0f7cd0f","Type":"ContainerStarted","Data":"86c815c2138dccb5e5c6a29bd8609e98c8d17944fe89580c01c2c7655ac7f667"} Jan 30 06:44:47 crc kubenswrapper[4841]: I0130 06:44:47.206690 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" podStartSLOduration=3.206678924 podStartE2EDuration="3.206678924s" podCreationTimestamp="2026-01-30 06:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:47.203714145 +0000 UTC m=+5824.197186783" watchObservedRunningTime="2026-01-30 06:44:47.206678924 +0000 UTC m=+5824.200151562" Jan 30 06:44:47 crc kubenswrapper[4841]: I0130 06:44:47.224378 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" podStartSLOduration=2.224358906 
podStartE2EDuration="2.224358906s" podCreationTimestamp="2026-01-30 06:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:47.220103062 +0000 UTC m=+5824.213575700" watchObservedRunningTime="2026-01-30 06:44:47.224358906 +0000 UTC m=+5824.217831544" Jan 30 06:44:48 crc kubenswrapper[4841]: I0130 06:44:48.674373 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:48 crc kubenswrapper[4841]: I0130 06:44:48.674965 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-log" containerID="cri-o://e4de32d38c29af149fbc2cb43bae0696cede6db965c16533eab42f831e6483a0" gracePeriod=30 Jan 30 06:44:48 crc kubenswrapper[4841]: I0130 06:44:48.675080 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-metadata" containerID="cri-o://4b19921ac258b7a81bd50539cda824dc6e1104ee0edd57de4e4db8a55fb77560" gracePeriod=30 Jan 30 06:44:48 crc kubenswrapper[4841]: I0130 06:44:48.682671 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:48 crc kubenswrapper[4841]: I0130 06:44:48.682860 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="401924cd-9c3a-4772-bf87-bb82985e2eec" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://16449dbb228c6ee3ca24c751b0d8270bac6f6fedeb8a08fd8e2a5808dd43443e" gracePeriod=30 Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.201205 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerID="4b19921ac258b7a81bd50539cda824dc6e1104ee0edd57de4e4db8a55fb77560" 
exitCode=0 Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.201501 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerID="e4de32d38c29af149fbc2cb43bae0696cede6db965c16533eab42f831e6483a0" exitCode=143 Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.201542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d55efc5-765f-4e49-be25-f7f98a23213e","Type":"ContainerDied","Data":"4b19921ac258b7a81bd50539cda824dc6e1104ee0edd57de4e4db8a55fb77560"} Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.201568 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d55efc5-765f-4e49-be25-f7f98a23213e","Type":"ContainerDied","Data":"e4de32d38c29af149fbc2cb43bae0696cede6db965c16533eab42f831e6483a0"} Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.201578 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d55efc5-765f-4e49-be25-f7f98a23213e","Type":"ContainerDied","Data":"699a8ca26c23dfd433fa32988673d168c3261e4fdbab974113818ac3e7a9aa7b"} Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.201586 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="699a8ca26c23dfd433fa32988673d168c3261e4fdbab974113818ac3e7a9aa7b" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.203752 4841 generic.go:334] "Generic (PLEG): container finished" podID="401924cd-9c3a-4772-bf87-bb82985e2eec" containerID="16449dbb228c6ee3ca24c751b0d8270bac6f6fedeb8a08fd8e2a5808dd43443e" exitCode=0 Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.203785 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"401924cd-9c3a-4772-bf87-bb82985e2eec","Type":"ContainerDied","Data":"16449dbb228c6ee3ca24c751b0d8270bac6f6fedeb8a08fd8e2a5808dd43443e"} Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 
06:44:49.205863 4841 generic.go:334] "Generic (PLEG): container finished" podID="995f0195-eb79-4f48-ac66-ac38b0f7cd0f" containerID="64c11b2824241bdfc1a972a0a8cb5f7c371117564c57e3a50363123323495590" exitCode=0 Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.205918 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" event={"ID":"995f0195-eb79-4f48-ac66-ac38b0f7cd0f","Type":"ContainerDied","Data":"64c11b2824241bdfc1a972a0a8cb5f7c371117564c57e3a50363123323495590"} Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.310738 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.397141 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d55efc5-765f-4e49-be25-f7f98a23213e-logs\") pod \"5d55efc5-765f-4e49-be25-f7f98a23213e\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.397227 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4z7\" (UniqueName: \"kubernetes.io/projected/5d55efc5-765f-4e49-be25-f7f98a23213e-kube-api-access-dg4z7\") pod \"5d55efc5-765f-4e49-be25-f7f98a23213e\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.397270 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-combined-ca-bundle\") pod \"5d55efc5-765f-4e49-be25-f7f98a23213e\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.397367 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-config-data\") pod \"5d55efc5-765f-4e49-be25-f7f98a23213e\" (UID: \"5d55efc5-765f-4e49-be25-f7f98a23213e\") " Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.398525 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d55efc5-765f-4e49-be25-f7f98a23213e-logs" (OuterVolumeSpecName: "logs") pod "5d55efc5-765f-4e49-be25-f7f98a23213e" (UID: "5d55efc5-765f-4e49-be25-f7f98a23213e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.403474 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d55efc5-765f-4e49-be25-f7f98a23213e-kube-api-access-dg4z7" (OuterVolumeSpecName: "kube-api-access-dg4z7") pod "5d55efc5-765f-4e49-be25-f7f98a23213e" (UID: "5d55efc5-765f-4e49-be25-f7f98a23213e"). InnerVolumeSpecName "kube-api-access-dg4z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.424848 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.426922 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d55efc5-765f-4e49-be25-f7f98a23213e" (UID: "5d55efc5-765f-4e49-be25-f7f98a23213e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.427971 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-config-data" (OuterVolumeSpecName: "config-data") pod "5d55efc5-765f-4e49-be25-f7f98a23213e" (UID: "5d55efc5-765f-4e49-be25-f7f98a23213e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.498488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvlmk\" (UniqueName: \"kubernetes.io/projected/401924cd-9c3a-4772-bf87-bb82985e2eec-kube-api-access-jvlmk\") pod \"401924cd-9c3a-4772-bf87-bb82985e2eec\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.498979 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-combined-ca-bundle\") pod \"401924cd-9c3a-4772-bf87-bb82985e2eec\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.499262 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-config-data\") pod \"401924cd-9c3a-4772-bf87-bb82985e2eec\" (UID: \"401924cd-9c3a-4772-bf87-bb82985e2eec\") " Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.500264 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d55efc5-765f-4e49-be25-f7f98a23213e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.500299 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg4z7\" (UniqueName: 
\"kubernetes.io/projected/5d55efc5-765f-4e49-be25-f7f98a23213e-kube-api-access-dg4z7\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.500320 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.500337 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d55efc5-765f-4e49-be25-f7f98a23213e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.501107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401924cd-9c3a-4772-bf87-bb82985e2eec-kube-api-access-jvlmk" (OuterVolumeSpecName: "kube-api-access-jvlmk") pod "401924cd-9c3a-4772-bf87-bb82985e2eec" (UID: "401924cd-9c3a-4772-bf87-bb82985e2eec"). InnerVolumeSpecName "kube-api-access-jvlmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.520593 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-config-data" (OuterVolumeSpecName: "config-data") pod "401924cd-9c3a-4772-bf87-bb82985e2eec" (UID: "401924cd-9c3a-4772-bf87-bb82985e2eec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.527003 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "401924cd-9c3a-4772-bf87-bb82985e2eec" (UID: "401924cd-9c3a-4772-bf87-bb82985e2eec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.601869 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvlmk\" (UniqueName: \"kubernetes.io/projected/401924cd-9c3a-4772-bf87-bb82985e2eec-kube-api-access-jvlmk\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.601961 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.602016 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401924cd-9c3a-4772-bf87-bb82985e2eec-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:49 crc kubenswrapper[4841]: I0130 06:44:49.906562 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.222311 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"401924cd-9c3a-4772-bf87-bb82985e2eec","Type":"ContainerDied","Data":"ffd60a457602c67c774d0b54d1da20b280de4b9ce1479527a04893ebbe17d5b5"} Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.222328 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.222428 4841 scope.go:117] "RemoveContainer" containerID="16449dbb228c6ee3ca24c751b0d8270bac6f6fedeb8a08fd8e2a5808dd43443e" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.225819 4841 generic.go:334] "Generic (PLEG): container finished" podID="273b206a-f216-479f-927c-aed775be10b6" containerID="4d625a00dde5495041c61246d870b950a9d82636d2edbbc24c5dc1c4e2c2729b" exitCode=0 Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.225897 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mccnj" event={"ID":"273b206a-f216-479f-927c-aed775be10b6","Type":"ContainerDied","Data":"4d625a00dde5495041c61246d870b950a9d82636d2edbbc24c5dc1c4e2c2729b"} Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.226169 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.320957 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.342543 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.357847 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.368864 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.377245 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: E0130 06:44:50.377706 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-metadata" Jan 30 06:44:50 crc 
kubenswrapper[4841]: I0130 06:44:50.377724 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-metadata" Jan 30 06:44:50 crc kubenswrapper[4841]: E0130 06:44:50.377745 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-log" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.377752 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-log" Jan 30 06:44:50 crc kubenswrapper[4841]: E0130 06:44:50.377760 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401924cd-9c3a-4772-bf87-bb82985e2eec" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.377767 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="401924cd-9c3a-4772-bf87-bb82985e2eec" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.377950 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-log" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.377962 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" containerName="nova-metadata-metadata" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.377978 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="401924cd-9c3a-4772-bf87-bb82985e2eec" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.378956 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.383306 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.383645 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.400217 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.402233 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.405525 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.405977 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.408655 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.412256 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418284 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/8a81333a-38af-40fb-9238-116fe8e5b6a8-logs\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418383 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418444 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcf2\" 
(UniqueName: \"kubernetes.io/projected/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-kube-api-access-4qcf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418810 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-config-data\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46nc\" (UniqueName: \"kubernetes.io/projected/8a81333a-38af-40fb-9238-116fe8e5b6a8-kube-api-access-h46nc\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.418927 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.424949 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.442940 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401924cd-9c3a-4772-bf87-bb82985e2eec" path="/var/lib/kubelet/pods/401924cd-9c3a-4772-bf87-bb82985e2eec/volumes" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.443808 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d55efc5-765f-4e49-be25-f7f98a23213e" 
path="/var/lib/kubelet/pods/5d55efc5-765f-4e49-be25-f7f98a23213e/volumes" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521387 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521480 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521501 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a81333a-38af-40fb-9238-116fe8e5b6a8-logs\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521524 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 
06:44:50.521583 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521639 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521655 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcf2\" (UniqueName: \"kubernetes.io/projected/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-kube-api-access-4qcf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521692 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-config-data\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.521716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h46nc\" (UniqueName: \"kubernetes.io/projected/8a81333a-38af-40fb-9238-116fe8e5b6a8-kube-api-access-h46nc\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.524832 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8a81333a-38af-40fb-9238-116fe8e5b6a8-logs\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.528654 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.528819 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.529883 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.531045 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.538996 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h46nc\" (UniqueName: \"kubernetes.io/projected/8a81333a-38af-40fb-9238-116fe8e5b6a8-kube-api-access-h46nc\") pod \"nova-metadata-0\" (UID: 
\"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.539776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-config-data\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.548014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.550151 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcf2\" (UniqueName: \"kubernetes.io/projected/b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae-kube-api-access-4qcf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.550173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.668977 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.703224 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.725477 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-combined-ca-bundle\") pod \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.725721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-config-data\") pod \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.726893 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26wwp\" (UniqueName: \"kubernetes.io/projected/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-kube-api-access-26wwp\") pod \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.726965 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-scripts\") pod \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\" (UID: \"995f0195-eb79-4f48-ac66-ac38b0f7cd0f\") " Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.727028 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.731250 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-kube-api-access-26wwp" (OuterVolumeSpecName: "kube-api-access-26wwp") pod "995f0195-eb79-4f48-ac66-ac38b0f7cd0f" (UID: "995f0195-eb79-4f48-ac66-ac38b0f7cd0f"). InnerVolumeSpecName "kube-api-access-26wwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.733056 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-scripts" (OuterVolumeSpecName: "scripts") pod "995f0195-eb79-4f48-ac66-ac38b0f7cd0f" (UID: "995f0195-eb79-4f48-ac66-ac38b0f7cd0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.775631 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "995f0195-eb79-4f48-ac66-ac38b0f7cd0f" (UID: "995f0195-eb79-4f48-ac66-ac38b0f7cd0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.782299 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-config-data" (OuterVolumeSpecName: "config-data") pod "995f0195-eb79-4f48-ac66-ac38b0f7cd0f" (UID: "995f0195-eb79-4f48-ac66-ac38b0f7cd0f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.830325 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.830672 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26wwp\" (UniqueName: \"kubernetes.io/projected/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-kube-api-access-26wwp\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.830689 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:50 crc kubenswrapper[4841]: I0130 06:44:50.830729 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995f0195-eb79-4f48-ac66-ac38b0f7cd0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.160353 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:51 crc kubenswrapper[4841]: W0130 06:44:51.164922 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a81333a_38af_40fb_9238_116fe8e5b6a8.slice/crio-07e9f5c7bb6af1cbf5421b9bab35ed0f3665044f73940bf65f047ad372ed9f4e WatchSource:0}: Error finding container 07e9f5c7bb6af1cbf5421b9bab35ed0f3665044f73940bf65f047ad372ed9f4e: Status 404 returned error can't find the container with id 07e9f5c7bb6af1cbf5421b9bab35ed0f3665044f73940bf65f047ad372ed9f4e Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.240378 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" 
event={"ID":"995f0195-eb79-4f48-ac66-ac38b0f7cd0f","Type":"ContainerDied","Data":"86c815c2138dccb5e5c6a29bd8609e98c8d17944fe89580c01c2c7655ac7f667"} Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.240442 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86c815c2138dccb5e5c6a29bd8609e98c8d17944fe89580c01c2c7655ac7f667" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.240417 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hwdkc" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.241767 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a81333a-38af-40fb-9238-116fe8e5b6a8","Type":"ContainerStarted","Data":"07e9f5c7bb6af1cbf5421b9bab35ed0f3665044f73940bf65f047ad372ed9f4e"} Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.260210 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 06:44:51 crc kubenswrapper[4841]: W0130 06:44:51.265775 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7eb1e2a_e1f2_4bb7_a95d_fbe741da44ae.slice/crio-f3cd70df4429534c58d1a31a74b356178691feaee30744bb5e6fee59445c08ed WatchSource:0}: Error finding container f3cd70df4429534c58d1a31a74b356178691feaee30744bb5e6fee59445c08ed: Status 404 returned error can't find the container with id f3cd70df4429534c58d1a31a74b356178691feaee30744bb5e6fee59445c08ed Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.432045 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:44:51 crc kubenswrapper[4841]: E0130 06:44:51.434626 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.531060 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.652728 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-config-data\") pod \"273b206a-f216-479f-927c-aed775be10b6\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.652834 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-scripts\") pod \"273b206a-f216-479f-927c-aed775be10b6\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.652923 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xptlb\" (UniqueName: \"kubernetes.io/projected/273b206a-f216-479f-927c-aed775be10b6-kube-api-access-xptlb\") pod \"273b206a-f216-479f-927c-aed775be10b6\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.652984 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-combined-ca-bundle\") pod \"273b206a-f216-479f-927c-aed775be10b6\" (UID: \"273b206a-f216-479f-927c-aed775be10b6\") " Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.656168 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/273b206a-f216-479f-927c-aed775be10b6-kube-api-access-xptlb" (OuterVolumeSpecName: "kube-api-access-xptlb") pod "273b206a-f216-479f-927c-aed775be10b6" (UID: "273b206a-f216-479f-927c-aed775be10b6"). InnerVolumeSpecName "kube-api-access-xptlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.656445 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-scripts" (OuterVolumeSpecName: "scripts") pod "273b206a-f216-479f-927c-aed775be10b6" (UID: "273b206a-f216-479f-927c-aed775be10b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.685081 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-config-data" (OuterVolumeSpecName: "config-data") pod "273b206a-f216-479f-927c-aed775be10b6" (UID: "273b206a-f216-479f-927c-aed775be10b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.692520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "273b206a-f216-479f-927c-aed775be10b6" (UID: "273b206a-f216-479f-927c-aed775be10b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.746687 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:44:51 crc kubenswrapper[4841]: E0130 06:44:51.747094 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273b206a-f216-479f-927c-aed775be10b6" containerName="nova-manage" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.747113 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="273b206a-f216-479f-927c-aed775be10b6" containerName="nova-manage" Jan 30 06:44:51 crc kubenswrapper[4841]: E0130 06:44:51.747147 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995f0195-eb79-4f48-ac66-ac38b0f7cd0f" containerName="nova-cell1-conductor-db-sync" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.747157 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="995f0195-eb79-4f48-ac66-ac38b0f7cd0f" containerName="nova-cell1-conductor-db-sync" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.747347 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="995f0195-eb79-4f48-ac66-ac38b0f7cd0f" containerName="nova-cell1-conductor-db-sync" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.747372 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="273b206a-f216-479f-927c-aed775be10b6" containerName="nova-manage" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.748234 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.754364 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.755531 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.755557 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.755570 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xptlb\" (UniqueName: \"kubernetes.io/projected/273b206a-f216-479f-927c-aed775be10b6-kube-api-access-xptlb\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.755582 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273b206a-f216-479f-927c-aed775be10b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.760420 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.857792 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g5qp\" (UniqueName: \"kubernetes.io/projected/85adb897-5ab0-44b4-95a1-36e1610522d8-kube-api-access-5g5qp\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.857848 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85adb897-5ab0-44b4-95a1-36e1610522d8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.858030 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85adb897-5ab0-44b4-95a1-36e1610522d8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.959701 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g5qp\" (UniqueName: \"kubernetes.io/projected/85adb897-5ab0-44b4-95a1-36e1610522d8-kube-api-access-5g5qp\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.959773 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85adb897-5ab0-44b4-95a1-36e1610522d8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.959876 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85adb897-5ab0-44b4-95a1-36e1610522d8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.965499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/85adb897-5ab0-44b4-95a1-36e1610522d8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.966686 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85adb897-5ab0-44b4-95a1-36e1610522d8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:51 crc kubenswrapper[4841]: I0130 06:44:51.988178 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g5qp\" (UniqueName: \"kubernetes.io/projected/85adb897-5ab0-44b4-95a1-36e1610522d8-kube-api-access-5g5qp\") pod \"nova-cell1-conductor-0\" (UID: \"85adb897-5ab0-44b4-95a1-36e1610522d8\") " pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.072219 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.253881 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mccnj" event={"ID":"273b206a-f216-479f-927c-aed775be10b6","Type":"ContainerDied","Data":"d92aba10f6a0f76cf3ddcca4e05d1c9b6585978fea93e26256581c7cf5501dfa"} Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.254146 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92aba10f6a0f76cf3ddcca4e05d1c9b6585978fea93e26256581c7cf5501dfa" Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.253956 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mccnj" Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.256794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae","Type":"ContainerStarted","Data":"9becb2d9dc74e335a515fa28c331975fbdd451295485c8ff27e1f4a8521b9e07"} Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.256837 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae","Type":"ContainerStarted","Data":"f3cd70df4429534c58d1a31a74b356178691feaee30744bb5e6fee59445c08ed"} Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.261961 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a81333a-38af-40fb-9238-116fe8e5b6a8","Type":"ContainerStarted","Data":"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec"} Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.262036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a81333a-38af-40fb-9238-116fe8e5b6a8","Type":"ContainerStarted","Data":"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774"} Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.298009 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.297989335 podStartE2EDuration="2.297989335s" podCreationTimestamp="2026-01-30 06:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:52.282107661 +0000 UTC m=+5829.275580299" watchObservedRunningTime="2026-01-30 06:44:52.297989335 +0000 UTC m=+5829.291461973" Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.328754 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.328735495 podStartE2EDuration="2.328735495s" podCreationTimestamp="2026-01-30 06:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:52.321961834 +0000 UTC m=+5829.315434492" watchObservedRunningTime="2026-01-30 06:44:52.328735495 +0000 UTC m=+5829.322208133" Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.453831 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.454079 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerName="nova-api-log" containerID="cri-o://8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9" gracePeriod=30 Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.454182 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerName="nova-api-api" containerID="cri-o://af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff" gracePeriod=30 Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.458994 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.459177 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d3447f0c-f61f-456b-8b1f-4956055d9bcc" containerName="nova-scheduler-scheduler" containerID="cri-o://f6e4fd5fcc96b19e980a941577cdf4a3eef3cc0b6c47033b2d6798ddfc7cdd8b" gracePeriod=30 Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.490080 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:52 crc kubenswrapper[4841]: W0130 06:44:52.601002 4841 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85adb897_5ab0_44b4_95a1_36e1610522d8.slice/crio-6fb2b65106ca48cc22f094bcf0c241589e8d2730f4d53e525c28016fcc424a0e WatchSource:0}: Error finding container 6fb2b65106ca48cc22f094bcf0c241589e8d2730f4d53e525c28016fcc424a0e: Status 404 returned error can't find the container with id 6fb2b65106ca48cc22f094bcf0c241589e8d2730f4d53e525c28016fcc424a0e Jan 30 06:44:52 crc kubenswrapper[4841]: I0130 06:44:52.609386 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.014510 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.080501 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-combined-ca-bundle\") pod \"70062e63-a5a7-40f5-add6-7c268838b5e5\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.080622 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-config-data\") pod \"70062e63-a5a7-40f5-add6-7c268838b5e5\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.080647 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t668\" (UniqueName: \"kubernetes.io/projected/70062e63-a5a7-40f5-add6-7c268838b5e5-kube-api-access-2t668\") pod \"70062e63-a5a7-40f5-add6-7c268838b5e5\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.080762 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70062e63-a5a7-40f5-add6-7c268838b5e5-logs\") pod \"70062e63-a5a7-40f5-add6-7c268838b5e5\" (UID: \"70062e63-a5a7-40f5-add6-7c268838b5e5\") " Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.081382 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70062e63-a5a7-40f5-add6-7c268838b5e5-logs" (OuterVolumeSpecName: "logs") pod "70062e63-a5a7-40f5-add6-7c268838b5e5" (UID: "70062e63-a5a7-40f5-add6-7c268838b5e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.091639 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70062e63-a5a7-40f5-add6-7c268838b5e5-kube-api-access-2t668" (OuterVolumeSpecName: "kube-api-access-2t668") pod "70062e63-a5a7-40f5-add6-7c268838b5e5" (UID: "70062e63-a5a7-40f5-add6-7c268838b5e5"). InnerVolumeSpecName "kube-api-access-2t668". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.105349 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-config-data" (OuterVolumeSpecName: "config-data") pod "70062e63-a5a7-40f5-add6-7c268838b5e5" (UID: "70062e63-a5a7-40f5-add6-7c268838b5e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.113577 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70062e63-a5a7-40f5-add6-7c268838b5e5" (UID: "70062e63-a5a7-40f5-add6-7c268838b5e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.182435 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.182704 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70062e63-a5a7-40f5-add6-7c268838b5e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.182721 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t668\" (UniqueName: \"kubernetes.io/projected/70062e63-a5a7-40f5-add6-7c268838b5e5-kube-api-access-2t668\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.182735 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70062e63-a5a7-40f5-add6-7c268838b5e5-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.274125 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85adb897-5ab0-44b4-95a1-36e1610522d8","Type":"ContainerStarted","Data":"98d72d7a331f6c5eaabad8e890904ef6fbedda7682879cfff023428dcc9b6042"} Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.274175 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85adb897-5ab0-44b4-95a1-36e1610522d8","Type":"ContainerStarted","Data":"6fb2b65106ca48cc22f094bcf0c241589e8d2730f4d53e525c28016fcc424a0e"} Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.275391 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.277657 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerID="af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff" exitCode=0 Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.277678 4841 generic.go:334] "Generic (PLEG): container finished" podID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerID="8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9" exitCode=143 Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.278272 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.281615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70062e63-a5a7-40f5-add6-7c268838b5e5","Type":"ContainerDied","Data":"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff"} Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.281743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70062e63-a5a7-40f5-add6-7c268838b5e5","Type":"ContainerDied","Data":"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9"} Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.281768 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"70062e63-a5a7-40f5-add6-7c268838b5e5","Type":"ContainerDied","Data":"8e6093c023ab5be02dc80a1a48ae83d8fc526a73cf7992554155b68de8d63d5f"} Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.281796 4841 scope.go:117] "RemoveContainer" containerID="af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.306592 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.306571287 podStartE2EDuration="2.306571287s" podCreationTimestamp="2026-01-30 06:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:53.301544763 +0000 UTC m=+5830.295017401" watchObservedRunningTime="2026-01-30 06:44:53.306571287 +0000 UTC m=+5830.300043935" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.322762 4841 scope.go:117] "RemoveContainer" containerID="8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.328812 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.338436 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.370279 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:53 crc kubenswrapper[4841]: E0130 06:44:53.370725 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerName="nova-api-api" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.370748 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerName="nova-api-api" Jan 30 06:44:53 crc kubenswrapper[4841]: E0130 06:44:53.370771 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerName="nova-api-log" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.370781 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerName="nova-api-log" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.370983 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" containerName="nova-api-api" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.371000 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" 
containerName="nova-api-log" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.372247 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.377487 4841 scope.go:117] "RemoveContainer" containerID="af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.377696 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 06:44:53 crc kubenswrapper[4841]: E0130 06:44:53.378302 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff\": container with ID starting with af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff not found: ID does not exist" containerID="af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.378524 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff"} err="failed to get container status \"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff\": rpc error: code = NotFound desc = could not find container \"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff\": container with ID starting with af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff not found: ID does not exist" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.378729 4841 scope.go:117] "RemoveContainer" containerID="8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.381796 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:53 crc kubenswrapper[4841]: E0130 06:44:53.383556 4841 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9\": container with ID starting with 8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9 not found: ID does not exist" containerID="8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.383697 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9"} err="failed to get container status \"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9\": rpc error: code = NotFound desc = could not find container \"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9\": container with ID starting with 8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9 not found: ID does not exist" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.383780 4841 scope.go:117] "RemoveContainer" containerID="af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.384183 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff"} err="failed to get container status \"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff\": rpc error: code = NotFound desc = could not find container \"af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff\": container with ID starting with af42991c83a2c76d16aaa0ace91c8fbc7966ab4015e942a6c31f36d900531fff not found: ID does not exist" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.384215 4841 scope.go:117] "RemoveContainer" containerID="8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 
06:44:53.384518 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9"} err="failed to get container status \"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9\": rpc error: code = NotFound desc = could not find container \"8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9\": container with ID starting with 8185d7417ca61fa12c07d14776b11c18994d90b28a1805c812c165db431d21f9 not found: ID does not exist" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.487524 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdcb\" (UniqueName: \"kubernetes.io/projected/0a4bb63f-dd64-497c-96a9-720d93b6812b-kube-api-access-jhdcb\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.487568 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.487685 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-config-data\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.487712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4bb63f-dd64-497c-96a9-720d93b6812b-logs\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " 
pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.589778 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdcb\" (UniqueName: \"kubernetes.io/projected/0a4bb63f-dd64-497c-96a9-720d93b6812b-kube-api-access-jhdcb\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.589834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.589953 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-config-data\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.590733 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4bb63f-dd64-497c-96a9-720d93b6812b-logs\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.591065 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4bb63f-dd64-497c-96a9-720d93b6812b-logs\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.594693 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-config-data\") pod 
\"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.595706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.616311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdcb\" (UniqueName: \"kubernetes.io/projected/0a4bb63f-dd64-497c-96a9-720d93b6812b-kube-api-access-jhdcb\") pod \"nova-api-0\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") " pod="openstack/nova-api-0" Jan 30 06:44:53 crc kubenswrapper[4841]: I0130 06:44:53.699861 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:44:54 crc kubenswrapper[4841]: W0130 06:44:54.234996 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4bb63f_dd64_497c_96a9_720d93b6812b.slice/crio-643739b7e3c42832bf39ee335849b7dbf72a12574421c7d3607531dd35a1b39b WatchSource:0}: Error finding container 643739b7e3c42832bf39ee335849b7dbf72a12574421c7d3607531dd35a1b39b: Status 404 returned error can't find the container with id 643739b7e3c42832bf39ee335849b7dbf72a12574421c7d3607531dd35a1b39b Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.254044 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.295554 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4bb63f-dd64-497c-96a9-720d93b6812b","Type":"ContainerStarted","Data":"643739b7e3c42832bf39ee335849b7dbf72a12574421c7d3607531dd35a1b39b"} Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 
06:44:54.297334 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-log" containerID="cri-o://d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774" gracePeriod=30 Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.297361 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-metadata" containerID="cri-o://828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec" gracePeriod=30 Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.449841 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70062e63-a5a7-40f5-add6-7c268838b5e5" path="/var/lib/kubelet/pods/70062e63-a5a7-40f5-add6-7c268838b5e5/volumes" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.763364 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.818057 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h46nc\" (UniqueName: \"kubernetes.io/projected/8a81333a-38af-40fb-9238-116fe8e5b6a8-kube-api-access-h46nc\") pod \"8a81333a-38af-40fb-9238-116fe8e5b6a8\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.818198 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-config-data\") pod \"8a81333a-38af-40fb-9238-116fe8e5b6a8\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.818274 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-nova-metadata-tls-certs\") pod \"8a81333a-38af-40fb-9238-116fe8e5b6a8\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.818388 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a81333a-38af-40fb-9238-116fe8e5b6a8-logs\") pod \"8a81333a-38af-40fb-9238-116fe8e5b6a8\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.818496 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-combined-ca-bundle\") pod \"8a81333a-38af-40fb-9238-116fe8e5b6a8\" (UID: \"8a81333a-38af-40fb-9238-116fe8e5b6a8\") " Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.819702 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8a81333a-38af-40fb-9238-116fe8e5b6a8-logs" (OuterVolumeSpecName: "logs") pod "8a81333a-38af-40fb-9238-116fe8e5b6a8" (UID: "8a81333a-38af-40fb-9238-116fe8e5b6a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.850076 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a81333a-38af-40fb-9238-116fe8e5b6a8-kube-api-access-h46nc" (OuterVolumeSpecName: "kube-api-access-h46nc") pod "8a81333a-38af-40fb-9238-116fe8e5b6a8" (UID: "8a81333a-38af-40fb-9238-116fe8e5b6a8"). InnerVolumeSpecName "kube-api-access-h46nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.897288 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-config-data" (OuterVolumeSpecName: "config-data") pod "8a81333a-38af-40fb-9238-116fe8e5b6a8" (UID: "8a81333a-38af-40fb-9238-116fe8e5b6a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.898557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a81333a-38af-40fb-9238-116fe8e5b6a8" (UID: "8a81333a-38af-40fb-9238-116fe8e5b6a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.918302 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8a81333a-38af-40fb-9238-116fe8e5b6a8" (UID: "8a81333a-38af-40fb-9238-116fe8e5b6a8"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.920671 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.920694 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h46nc\" (UniqueName: \"kubernetes.io/projected/8a81333a-38af-40fb-9238-116fe8e5b6a8-kube-api-access-h46nc\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.920705 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.920713 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a81333a-38af-40fb-9238-116fe8e5b6a8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.920723 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a81333a-38af-40fb-9238-116fe8e5b6a8-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:54 crc kubenswrapper[4841]: I0130 06:44:54.981566 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.055162 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4d4c5f95-l26l4"] Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.055749 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" 
podUID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerName="dnsmasq-dns" containerID="cri-o://8ae61371f9b8019989aff7c298726f6e521e295879dccf935c7ecca29daa1a6b" gracePeriod=10 Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.310716 4841 generic.go:334] "Generic (PLEG): container finished" podID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerID="8ae61371f9b8019989aff7c298726f6e521e295879dccf935c7ecca29daa1a6b" exitCode=0 Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.310770 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" event={"ID":"5b88f273-35e9-4a59-be7e-de1eebd92300","Type":"ContainerDied","Data":"8ae61371f9b8019989aff7c298726f6e521e295879dccf935c7ecca29daa1a6b"} Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.312511 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerID="828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec" exitCode=0 Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.312536 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerID="d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774" exitCode=143 Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.312581 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a81333a-38af-40fb-9238-116fe8e5b6a8","Type":"ContainerDied","Data":"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec"} Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.312597 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.312602 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a81333a-38af-40fb-9238-116fe8e5b6a8","Type":"ContainerDied","Data":"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774"} Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.312630 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a81333a-38af-40fb-9238-116fe8e5b6a8","Type":"ContainerDied","Data":"07e9f5c7bb6af1cbf5421b9bab35ed0f3665044f73940bf65f047ad372ed9f4e"} Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.312648 4841 scope.go:117] "RemoveContainer" containerID="828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.321417 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4bb63f-dd64-497c-96a9-720d93b6812b","Type":"ContainerStarted","Data":"75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51"} Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.321453 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4bb63f-dd64-497c-96a9-720d93b6812b","Type":"ContainerStarted","Data":"d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909"} Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.364947 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.36493379 podStartE2EDuration="2.36493379s" podCreationTimestamp="2026-01-30 06:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:55.364068937 +0000 UTC m=+5832.357541575" watchObservedRunningTime="2026-01-30 06:44:55.36493379 +0000 UTC m=+5832.358406428" Jan 30 06:44:55 crc 
kubenswrapper[4841]: I0130 06:44:55.368063 4841 scope.go:117] "RemoveContainer" containerID="d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.442060 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.454455 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.475458 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:55 crc kubenswrapper[4841]: E0130 06:44:55.475854 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-metadata" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.475875 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-metadata" Jan 30 06:44:55 crc kubenswrapper[4841]: E0130 06:44:55.475892 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-log" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.475901 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-log" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.476075 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-metadata" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.476103 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" containerName="nova-metadata-log" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.477085 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.479722 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.479976 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.480412 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.498741 4841 scope.go:117] "RemoveContainer" containerID="828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec" Jan 30 06:44:55 crc kubenswrapper[4841]: E0130 06:44:55.499682 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec\": container with ID starting with 828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec not found: ID does not exist" containerID="828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.499710 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec"} err="failed to get container status \"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec\": rpc error: code = NotFound desc = could not find container \"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec\": container with ID starting with 828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec not found: ID does not exist" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.499732 4841 scope.go:117] "RemoveContainer" containerID="d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774" Jan 30 06:44:55 crc 
kubenswrapper[4841]: E0130 06:44:55.504987 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774\": container with ID starting with d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774 not found: ID does not exist" containerID="d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.505020 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774"} err="failed to get container status \"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774\": rpc error: code = NotFound desc = could not find container \"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774\": container with ID starting with d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774 not found: ID does not exist" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.505040 4841 scope.go:117] "RemoveContainer" containerID="828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.511819 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec"} err="failed to get container status \"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec\": rpc error: code = NotFound desc = could not find container \"828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec\": container with ID starting with 828e676fc9ea2f5deb3d0af1d39694c486d870ced44cc6482480d97f2f5b66ec not found: ID does not exist" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.511857 4841 scope.go:117] "RemoveContainer" containerID="d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774" Jan 30 
06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.517359 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774"} err="failed to get container status \"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774\": rpc error: code = NotFound desc = could not find container \"d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774\": container with ID starting with d9a280f5b2f9255df793ce146d8c0085a98989f68d10a40d1d70f6cb8819a774 not found: ID does not exist" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.575443 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.636071 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b20f68-5582-416b-93a8-b2ec8ffe7503-logs\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.636347 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.636800 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-config-data\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.637180 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.637246 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2bp\" (UniqueName: \"kubernetes.io/projected/59b20f68-5582-416b-93a8-b2ec8ffe7503-kube-api-access-2r2bp\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.727569 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.738738 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mdpr\" (UniqueName: \"kubernetes.io/projected/5b88f273-35e9-4a59-be7e-de1eebd92300-kube-api-access-8mdpr\") pod \"5b88f273-35e9-4a59-be7e-de1eebd92300\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.738846 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-config\") pod \"5b88f273-35e9-4a59-be7e-de1eebd92300\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.738928 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-sb\") pod \"5b88f273-35e9-4a59-be7e-de1eebd92300\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.739011 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-nb\") pod \"5b88f273-35e9-4a59-be7e-de1eebd92300\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.739722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-dns-svc\") pod \"5b88f273-35e9-4a59-be7e-de1eebd92300\" (UID: \"5b88f273-35e9-4a59-be7e-de1eebd92300\") " Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.739895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-config-data\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.739966 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.740011 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2bp\" (UniqueName: \"kubernetes.io/projected/59b20f68-5582-416b-93a8-b2ec8ffe7503-kube-api-access-2r2bp\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.740143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b20f68-5582-416b-93a8-b2ec8ffe7503-logs\") pod \"nova-metadata-0\" 
(UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.740167 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.741178 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b20f68-5582-416b-93a8-b2ec8ffe7503-logs\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.744171 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f273-35e9-4a59-be7e-de1eebd92300-kube-api-access-8mdpr" (OuterVolumeSpecName: "kube-api-access-8mdpr") pod "5b88f273-35e9-4a59-be7e-de1eebd92300" (UID: "5b88f273-35e9-4a59-be7e-de1eebd92300"). InnerVolumeSpecName "kube-api-access-8mdpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.745189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.746316 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.746682 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-config-data\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.765473 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2bp\" (UniqueName: \"kubernetes.io/projected/59b20f68-5582-416b-93a8-b2ec8ffe7503-kube-api-access-2r2bp\") pod \"nova-metadata-0\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.794841 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.800111 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b88f273-35e9-4a59-be7e-de1eebd92300" (UID: "5b88f273-35e9-4a59-be7e-de1eebd92300"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.807369 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-config" (OuterVolumeSpecName: "config") pod "5b88f273-35e9-4a59-be7e-de1eebd92300" (UID: "5b88f273-35e9-4a59-be7e-de1eebd92300"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.816858 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b88f273-35e9-4a59-be7e-de1eebd92300" (UID: "5b88f273-35e9-4a59-be7e-de1eebd92300"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.823676 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b88f273-35e9-4a59-be7e-de1eebd92300" (UID: "5b88f273-35e9-4a59-be7e-de1eebd92300"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.842421 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.842443 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.842454 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.842463 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b88f273-35e9-4a59-be7e-de1eebd92300-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:55 crc kubenswrapper[4841]: I0130 06:44:55.842473 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mdpr\" (UniqueName: \"kubernetes.io/projected/5b88f273-35e9-4a59-be7e-de1eebd92300-kube-api-access-8mdpr\") on node \"crc\" DevicePath \"\"" Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.289606 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:44:56 crc kubenswrapper[4841]: W0130 06:44:56.295612 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59b20f68_5582_416b_93a8_b2ec8ffe7503.slice/crio-e54594e27367c85906bf8a7dee807a16890fdba016543462711a59a4a67a6a50 WatchSource:0}: Error finding container e54594e27367c85906bf8a7dee807a16890fdba016543462711a59a4a67a6a50: Status 404 returned error can't find the container 
with id e54594e27367c85906bf8a7dee807a16890fdba016543462711a59a4a67a6a50 Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.334733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59b20f68-5582-416b-93a8-b2ec8ffe7503","Type":"ContainerStarted","Data":"e54594e27367c85906bf8a7dee807a16890fdba016543462711a59a4a67a6a50"} Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.338205 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" event={"ID":"5b88f273-35e9-4a59-be7e-de1eebd92300","Type":"ContainerDied","Data":"1fc85c7b4e49d56c1b9d7803bff90acb38b8bbdff1871c9e6160b3a544d32ff4"} Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.338242 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4d4c5f95-l26l4" Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.338268 4841 scope.go:117] "RemoveContainer" containerID="8ae61371f9b8019989aff7c298726f6e521e295879dccf935c7ecca29daa1a6b" Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.412243 4841 scope.go:117] "RemoveContainer" containerID="f68ec22e4f4f230cc6eb11e856ad1151f2ae4e247c6179d44d6fed5e67e21c97" Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.430248 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4d4c5f95-l26l4"] Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.448143 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a81333a-38af-40fb-9238-116fe8e5b6a8" path="/var/lib/kubelet/pods/8a81333a-38af-40fb-9238-116fe8e5b6a8/volumes" Jan 30 06:44:56 crc kubenswrapper[4841]: I0130 06:44:56.448759 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4d4c5f95-l26l4"] Jan 30 06:44:57 crc kubenswrapper[4841]: I0130 06:44:57.113750 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 30 
06:44:57 crc kubenswrapper[4841]: I0130 06:44:57.353769 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59b20f68-5582-416b-93a8-b2ec8ffe7503","Type":"ContainerStarted","Data":"f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c"} Jan 30 06:44:57 crc kubenswrapper[4841]: I0130 06:44:57.353818 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59b20f68-5582-416b-93a8-b2ec8ffe7503","Type":"ContainerStarted","Data":"bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094"} Jan 30 06:44:57 crc kubenswrapper[4841]: I0130 06:44:57.398062 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.398036649 podStartE2EDuration="2.398036649s" podCreationTimestamp="2026-01-30 06:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:44:57.389226294 +0000 UTC m=+5834.382698932" watchObservedRunningTime="2026-01-30 06:44:57.398036649 +0000 UTC m=+5834.391509297" Jan 30 06:44:58 crc kubenswrapper[4841]: I0130 06:44:58.447686 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f273-35e9-4a59-be7e-de1eebd92300" path="/var/lib/kubelet/pods/5b88f273-35e9-4a59-be7e-de1eebd92300/volumes" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.175832 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8"] Jan 30 06:45:00 crc kubenswrapper[4841]: E0130 06:45:00.176277 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerName="dnsmasq-dns" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.176292 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerName="dnsmasq-dns" Jan 30 06:45:00 
crc kubenswrapper[4841]: E0130 06:45:00.176321 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerName="init" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.176330 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerName="init" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.176564 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b88f273-35e9-4a59-be7e-de1eebd92300" containerName="dnsmasq-dns" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.177301 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.182532 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.185450 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.188229 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8"] Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.248922 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4307ec57-fc06-43de-96d9-3fc582a3a6f3-secret-volume\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.249299 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4307ec57-fc06-43de-96d9-3fc582a3a6f3-config-volume\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.249495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxr58\" (UniqueName: \"kubernetes.io/projected/4307ec57-fc06-43de-96d9-3fc582a3a6f3-kube-api-access-vxr58\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.352946 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4307ec57-fc06-43de-96d9-3fc582a3a6f3-secret-volume\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.353258 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4307ec57-fc06-43de-96d9-3fc582a3a6f3-config-volume\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.353341 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxr58\" (UniqueName: \"kubernetes.io/projected/4307ec57-fc06-43de-96d9-3fc582a3a6f3-kube-api-access-vxr58\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc 
kubenswrapper[4841]: I0130 06:45:00.354859 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4307ec57-fc06-43de-96d9-3fc582a3a6f3-config-volume\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.364741 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4307ec57-fc06-43de-96d9-3fc582a3a6f3-secret-volume\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.381059 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxr58\" (UniqueName: \"kubernetes.io/projected/4307ec57-fc06-43de-96d9-3fc582a3a6f3-kube-api-access-vxr58\") pod \"collect-profiles-29495925-5qxm8\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.530324 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.728131 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.746786 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.795410 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:45:00 crc kubenswrapper[4841]: I0130 06:45:00.795705 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.072645 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8"] Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.418035 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" event={"ID":"4307ec57-fc06-43de-96d9-3fc582a3a6f3","Type":"ContainerStarted","Data":"6f259f8c58f03d0494cdbf314c0434410ba191d59a2187ee501f1c485d3b0d68"} Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.418358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" event={"ID":"4307ec57-fc06-43de-96d9-3fc582a3a6f3","Type":"ContainerStarted","Data":"057f408a80364963d9e99030efcc07105d52afc60b33e060fd9d9bdb84f10eaa"} Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.456810 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.501045 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" podStartSLOduration=1.501021648 podStartE2EDuration="1.501021648s" podCreationTimestamp="2026-01-30 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:01.442720073 +0000 UTC m=+5838.436192711" watchObservedRunningTime="2026-01-30 06:45:01.501021648 +0000 UTC m=+5838.494494306" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.720712 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wxs7z"] Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.721974 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.723839 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.724159 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.733929 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxs7z"] Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.801944 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.802170 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-config-data\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.802224 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-scripts\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.802674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fccw7\" (UniqueName: \"kubernetes.io/projected/9ff06e12-bd8d-4324-8521-3363f844ca41-kube-api-access-fccw7\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.904189 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fccw7\" (UniqueName: \"kubernetes.io/projected/9ff06e12-bd8d-4324-8521-3363f844ca41-kube-api-access-fccw7\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.904266 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.904320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-config-data\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.904351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-scripts\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.910515 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-scripts\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.910704 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.913527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-config-data\") pod \"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:01 crc kubenswrapper[4841]: I0130 06:45:01.934593 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fccw7\" (UniqueName: \"kubernetes.io/projected/9ff06e12-bd8d-4324-8521-3363f844ca41-kube-api-access-fccw7\") pod 
\"nova-cell1-cell-mapping-wxs7z\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:02 crc kubenswrapper[4841]: I0130 06:45:02.037454 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:02 crc kubenswrapper[4841]: I0130 06:45:02.320529 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxs7z"] Jan 30 06:45:02 crc kubenswrapper[4841]: W0130 06:45:02.323443 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff06e12_bd8d_4324_8521_3363f844ca41.slice/crio-2db1d98dd83b58cc5c9a095bd0b88413d2f781df7060db900689c343a0c153c2 WatchSource:0}: Error finding container 2db1d98dd83b58cc5c9a095bd0b88413d2f781df7060db900689c343a0c153c2: Status 404 returned error can't find the container with id 2db1d98dd83b58cc5c9a095bd0b88413d2f781df7060db900689c343a0c153c2 Jan 30 06:45:02 crc kubenswrapper[4841]: I0130 06:45:02.468250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxs7z" event={"ID":"9ff06e12-bd8d-4324-8521-3363f844ca41","Type":"ContainerStarted","Data":"2db1d98dd83b58cc5c9a095bd0b88413d2f781df7060db900689c343a0c153c2"} Jan 30 06:45:02 crc kubenswrapper[4841]: I0130 06:45:02.469380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" event={"ID":"4307ec57-fc06-43de-96d9-3fc582a3a6f3","Type":"ContainerDied","Data":"6f259f8c58f03d0494cdbf314c0434410ba191d59a2187ee501f1c485d3b0d68"} Jan 30 06:45:02 crc kubenswrapper[4841]: I0130 06:45:02.468820 4841 generic.go:334] "Generic (PLEG): container finished" podID="4307ec57-fc06-43de-96d9-3fc582a3a6f3" containerID="6f259f8c58f03d0494cdbf314c0434410ba191d59a2187ee501f1c485d3b0d68" exitCode=0 Jan 30 06:45:03 crc kubenswrapper[4841]: I0130 06:45:03.490016 
4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxs7z" event={"ID":"9ff06e12-bd8d-4324-8521-3363f844ca41","Type":"ContainerStarted","Data":"e207f845fb90b7268c4ac6168efac75c16bf89e47f6e2148f595517a174980a6"} Jan 30 06:45:03 crc kubenswrapper[4841]: I0130 06:45:03.516171 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wxs7z" podStartSLOduration=2.516149697 podStartE2EDuration="2.516149697s" podCreationTimestamp="2026-01-30 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:03.514487103 +0000 UTC m=+5840.507959781" watchObservedRunningTime="2026-01-30 06:45:03.516149697 +0000 UTC m=+5840.509622345" Jan 30 06:45:03 crc kubenswrapper[4841]: I0130 06:45:03.700508 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:45:03 crc kubenswrapper[4841]: I0130 06:45:03.700585 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.006077 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.152019 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4307ec57-fc06-43de-96d9-3fc582a3a6f3-config-volume\") pod \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.152212 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxr58\" (UniqueName: \"kubernetes.io/projected/4307ec57-fc06-43de-96d9-3fc582a3a6f3-kube-api-access-vxr58\") pod \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.152256 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4307ec57-fc06-43de-96d9-3fc582a3a6f3-secret-volume\") pod \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\" (UID: \"4307ec57-fc06-43de-96d9-3fc582a3a6f3\") " Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.152705 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4307ec57-fc06-43de-96d9-3fc582a3a6f3-config-volume" (OuterVolumeSpecName: "config-volume") pod "4307ec57-fc06-43de-96d9-3fc582a3a6f3" (UID: "4307ec57-fc06-43de-96d9-3fc582a3a6f3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.157333 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4307ec57-fc06-43de-96d9-3fc582a3a6f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4307ec57-fc06-43de-96d9-3fc582a3a6f3" (UID: "4307ec57-fc06-43de-96d9-3fc582a3a6f3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.157694 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4307ec57-fc06-43de-96d9-3fc582a3a6f3-kube-api-access-vxr58" (OuterVolumeSpecName: "kube-api-access-vxr58") pod "4307ec57-fc06-43de-96d9-3fc582a3a6f3" (UID: "4307ec57-fc06-43de-96d9-3fc582a3a6f3"). InnerVolumeSpecName "kube-api-access-vxr58". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.254831 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4307ec57-fc06-43de-96d9-3fc582a3a6f3-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.254871 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxr58\" (UniqueName: \"kubernetes.io/projected/4307ec57-fc06-43de-96d9-3fc582a3a6f3-kube-api-access-vxr58\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.254882 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4307ec57-fc06-43de-96d9-3fc582a3a6f3-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.437629 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:45:04 crc kubenswrapper[4841]: E0130 06:45:04.437941 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" 
Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.500096 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" event={"ID":"4307ec57-fc06-43de-96d9-3fc582a3a6f3","Type":"ContainerDied","Data":"057f408a80364963d9e99030efcc07105d52afc60b33e060fd9d9bdb84f10eaa"} Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.500144 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="057f408a80364963d9e99030efcc07105d52afc60b33e060fd9d9bdb84f10eaa" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.500161 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495925-5qxm8" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.526531 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm"] Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.538532 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495880-hvjbm"] Jan 30 06:45:04 crc kubenswrapper[4841]: E0130 06:45:04.704390 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4307ec57_fc06_43de_96d9_3fc582a3a6f3.slice/crio-057f408a80364963d9e99030efcc07105d52afc60b33e060fd9d9bdb84f10eaa\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4307ec57_fc06_43de_96d9_3fc582a3a6f3.slice\": RecentStats: unable to find data in memory cache]" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.786993 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.1.91:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:04 crc kubenswrapper[4841]: I0130 06:45:04.787985 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.91:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:05 crc kubenswrapper[4841]: I0130 06:45:05.795874 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:45:05 crc kubenswrapper[4841]: I0130 06:45:05.795930 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 06:45:06 crc kubenswrapper[4841]: I0130 06:45:06.446975 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f21364d-f9d0-4b5c-897e-44d183bbd441" path="/var/lib/kubelet/pods/5f21364d-f9d0-4b5c-897e-44d183bbd441/volumes" Jan 30 06:45:06 crc kubenswrapper[4841]: I0130 06:45:06.809610 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.92:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:06 crc kubenswrapper[4841]: I0130 06:45:06.809689 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.92:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:07 crc kubenswrapper[4841]: I0130 06:45:07.530510 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ff06e12-bd8d-4324-8521-3363f844ca41" 
containerID="e207f845fb90b7268c4ac6168efac75c16bf89e47f6e2148f595517a174980a6" exitCode=0 Jan 30 06:45:07 crc kubenswrapper[4841]: I0130 06:45:07.530562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxs7z" event={"ID":"9ff06e12-bd8d-4324-8521-3363f844ca41","Type":"ContainerDied","Data":"e207f845fb90b7268c4ac6168efac75c16bf89e47f6e2148f595517a174980a6"} Jan 30 06:45:08 crc kubenswrapper[4841]: I0130 06:45:08.940952 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:08 crc kubenswrapper[4841]: I0130 06:45:08.989157 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-combined-ca-bundle\") pod \"9ff06e12-bd8d-4324-8521-3363f844ca41\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " Jan 30 06:45:08 crc kubenswrapper[4841]: I0130 06:45:08.989225 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-config-data\") pod \"9ff06e12-bd8d-4324-8521-3363f844ca41\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " Jan 30 06:45:08 crc kubenswrapper[4841]: I0130 06:45:08.989424 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fccw7\" (UniqueName: \"kubernetes.io/projected/9ff06e12-bd8d-4324-8521-3363f844ca41-kube-api-access-fccw7\") pod \"9ff06e12-bd8d-4324-8521-3363f844ca41\" (UID: \"9ff06e12-bd8d-4324-8521-3363f844ca41\") " Jan 30 06:45:08 crc kubenswrapper[4841]: I0130 06:45:08.989486 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-scripts\") pod \"9ff06e12-bd8d-4324-8521-3363f844ca41\" (UID: 
\"9ff06e12-bd8d-4324-8521-3363f844ca41\") " Jan 30 06:45:08 crc kubenswrapper[4841]: I0130 06:45:08.997740 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff06e12-bd8d-4324-8521-3363f844ca41-kube-api-access-fccw7" (OuterVolumeSpecName: "kube-api-access-fccw7") pod "9ff06e12-bd8d-4324-8521-3363f844ca41" (UID: "9ff06e12-bd8d-4324-8521-3363f844ca41"). InnerVolumeSpecName "kube-api-access-fccw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:08 crc kubenswrapper[4841]: I0130 06:45:08.998460 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-scripts" (OuterVolumeSpecName: "scripts") pod "9ff06e12-bd8d-4324-8521-3363f844ca41" (UID: "9ff06e12-bd8d-4324-8521-3363f844ca41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.024547 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-config-data" (OuterVolumeSpecName: "config-data") pod "9ff06e12-bd8d-4324-8521-3363f844ca41" (UID: "9ff06e12-bd8d-4324-8521-3363f844ca41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.031165 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ff06e12-bd8d-4324-8521-3363f844ca41" (UID: "9ff06e12-bd8d-4324-8521-3363f844ca41"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.091824 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fccw7\" (UniqueName: \"kubernetes.io/projected/9ff06e12-bd8d-4324-8521-3363f844ca41-kube-api-access-fccw7\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.091859 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.091873 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.091885 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ff06e12-bd8d-4324-8521-3363f844ca41-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.559819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxs7z" event={"ID":"9ff06e12-bd8d-4324-8521-3363f844ca41","Type":"ContainerDied","Data":"2db1d98dd83b58cc5c9a095bd0b88413d2f781df7060db900689c343a0c153c2"} Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.559862 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db1d98dd83b58cc5c9a095bd0b88413d2f781df7060db900689c343a0c153c2" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.560001 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxs7z" Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.790909 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.791148 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-log" containerID="cri-o://d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909" gracePeriod=30 Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.791254 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-api" containerID="cri-o://75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51" gracePeriod=30 Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.804995 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.805265 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-log" containerID="cri-o://bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094" gracePeriod=30 Jan 30 06:45:09 crc kubenswrapper[4841]: I0130 06:45:09.805340 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-metadata" containerID="cri-o://f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c" gracePeriod=30 Jan 30 06:45:10 crc kubenswrapper[4841]: I0130 06:45:10.621437 4841 generic.go:334] "Generic (PLEG): container finished" podID="59b20f68-5582-416b-93a8-b2ec8ffe7503" 
containerID="bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094" exitCode=143 Jan 30 06:45:10 crc kubenswrapper[4841]: I0130 06:45:10.621566 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59b20f68-5582-416b-93a8-b2ec8ffe7503","Type":"ContainerDied","Data":"bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094"} Jan 30 06:45:10 crc kubenswrapper[4841]: I0130 06:45:10.624580 4841 generic.go:334] "Generic (PLEG): container finished" podID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerID="d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909" exitCode=143 Jan 30 06:45:10 crc kubenswrapper[4841]: I0130 06:45:10.624613 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4bb63f-dd64-497c-96a9-720d93b6812b","Type":"ContainerDied","Data":"d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909"} Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.518285 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.598488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b20f68-5582-416b-93a8-b2ec8ffe7503-logs\") pod \"59b20f68-5582-416b-93a8-b2ec8ffe7503\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.598613 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-nova-metadata-tls-certs\") pod \"59b20f68-5582-416b-93a8-b2ec8ffe7503\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.598698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-config-data\") pod \"59b20f68-5582-416b-93a8-b2ec8ffe7503\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.598754 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r2bp\" (UniqueName: \"kubernetes.io/projected/59b20f68-5582-416b-93a8-b2ec8ffe7503-kube-api-access-2r2bp\") pod \"59b20f68-5582-416b-93a8-b2ec8ffe7503\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.598796 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-combined-ca-bundle\") pod \"59b20f68-5582-416b-93a8-b2ec8ffe7503\" (UID: \"59b20f68-5582-416b-93a8-b2ec8ffe7503\") " Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.598969 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/59b20f68-5582-416b-93a8-b2ec8ffe7503-logs" (OuterVolumeSpecName: "logs") pod "59b20f68-5582-416b-93a8-b2ec8ffe7503" (UID: "59b20f68-5582-416b-93a8-b2ec8ffe7503"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.599574 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59b20f68-5582-416b-93a8-b2ec8ffe7503-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.610542 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b20f68-5582-416b-93a8-b2ec8ffe7503-kube-api-access-2r2bp" (OuterVolumeSpecName: "kube-api-access-2r2bp") pod "59b20f68-5582-416b-93a8-b2ec8ffe7503" (UID: "59b20f68-5582-416b-93a8-b2ec8ffe7503"). InnerVolumeSpecName "kube-api-access-2r2bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.644494 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59b20f68-5582-416b-93a8-b2ec8ffe7503" (UID: "59b20f68-5582-416b-93a8-b2ec8ffe7503"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.645285 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-config-data" (OuterVolumeSpecName: "config-data") pod "59b20f68-5582-416b-93a8-b2ec8ffe7503" (UID: "59b20f68-5582-416b-93a8-b2ec8ffe7503"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.670581 4841 generic.go:334] "Generic (PLEG): container finished" podID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerID="f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c" exitCode=0
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.670624 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59b20f68-5582-416b-93a8-b2ec8ffe7503","Type":"ContainerDied","Data":"f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c"}
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.670650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59b20f68-5582-416b-93a8-b2ec8ffe7503","Type":"ContainerDied","Data":"e54594e27367c85906bf8a7dee807a16890fdba016543462711a59a4a67a6a50"}
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.670667 4841 scope.go:117] "RemoveContainer" containerID="f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c"
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.670797 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.678662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "59b20f68-5582-416b-93a8-b2ec8ffe7503" (UID: "59b20f68-5582-416b-93a8-b2ec8ffe7503"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.700878 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.700907 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r2bp\" (UniqueName: \"kubernetes.io/projected/59b20f68-5582-416b-93a8-b2ec8ffe7503-kube-api-access-2r2bp\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.700920 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.700930 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b20f68-5582-416b-93a8-b2ec8ffe7503-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.713920 4841 scope.go:117] "RemoveContainer" containerID="bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094"
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.734255 4841 scope.go:117] "RemoveContainer" containerID="f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c"
Jan 30 06:45:13 crc kubenswrapper[4841]: E0130 06:45:13.734723 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c\": container with ID starting with f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c not found: ID does not exist" containerID="f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c"
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.734766 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c"} err="failed to get container status \"f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c\": rpc error: code = NotFound desc = could not find container \"f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c\": container with ID starting with f378c9bb78eae5d1653d83dbb061a599e06782ee03a2d9c105a052e7bfe0a48c not found: ID does not exist"
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.734795 4841 scope.go:117] "RemoveContainer" containerID="bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094"
Jan 30 06:45:13 crc kubenswrapper[4841]: E0130 06:45:13.735179 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094\": container with ID starting with bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094 not found: ID does not exist" containerID="bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094"
Jan 30 06:45:13 crc kubenswrapper[4841]: I0130 06:45:13.735215 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094"} err="failed to get container status \"bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094\": rpc error: code = NotFound desc = could not find container \"bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094\": container with ID starting with bb58d04a8a4e010bc83ccb7e024bd93e6c3573afd7934583c5bf2d6d8a1bb094 not found: ID does not exist"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.022460 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.049471 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.073899 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 06:45:14 crc kubenswrapper[4841]: E0130 06:45:14.074771 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff06e12-bd8d-4324-8521-3363f844ca41" containerName="nova-manage"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.074796 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff06e12-bd8d-4324-8521-3363f844ca41" containerName="nova-manage"
Jan 30 06:45:14 crc kubenswrapper[4841]: E0130 06:45:14.074845 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4307ec57-fc06-43de-96d9-3fc582a3a6f3" containerName="collect-profiles"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.074855 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4307ec57-fc06-43de-96d9-3fc582a3a6f3" containerName="collect-profiles"
Jan 30 06:45:14 crc kubenswrapper[4841]: E0130 06:45:14.074874 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-metadata"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.074885 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-metadata"
Jan 30 06:45:14 crc kubenswrapper[4841]: E0130 06:45:14.074908 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-log"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.074918 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-log"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.075625 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4307ec57-fc06-43de-96d9-3fc582a3a6f3" containerName="collect-profiles"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.075653 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-metadata"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.075681 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff06e12-bd8d-4324-8521-3363f844ca41" containerName="nova-manage"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.075715 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" containerName="nova-metadata-log"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.077944 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.082366 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.082751 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.088070 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.108472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rht4\" (UniqueName: \"kubernetes.io/projected/12ae12c4-8995-4bcd-9237-393737f5ae1d-kube-api-access-9rht4\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.108532 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.108565 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-config-data\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.108726 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.108832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ae12c4-8995-4bcd-9237-393737f5ae1d-logs\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.209866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.209937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-config-data\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.210000 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.210038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ae12c4-8995-4bcd-9237-393737f5ae1d-logs\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.210134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rht4\" (UniqueName: \"kubernetes.io/projected/12ae12c4-8995-4bcd-9237-393737f5ae1d-kube-api-access-9rht4\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.210938 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ae12c4-8995-4bcd-9237-393737f5ae1d-logs\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.214658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.214889 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.221840 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ae12c4-8995-4bcd-9237-393737f5ae1d-config-data\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.238211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rht4\" (UniqueName: \"kubernetes.io/projected/12ae12c4-8995-4bcd-9237-393737f5ae1d-kube-api-access-9rht4\") pod \"nova-metadata-0\" (UID: \"12ae12c4-8995-4bcd-9237-393737f5ae1d\") " pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.407604 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.451608 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b20f68-5582-416b-93a8-b2ec8ffe7503" path="/var/lib/kubelet/pods/59b20f68-5582-416b-93a8-b2ec8ffe7503/volumes"
Jan 30 06:45:14 crc kubenswrapper[4841]: W0130 06:45:14.925530 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12ae12c4_8995_4bcd_9237_393737f5ae1d.slice/crio-4903280ba25dd5ef4bd92bdc0ce487078139d7d491c2e3ba4ec4b1ad8e5a0879 WatchSource:0}: Error finding container 4903280ba25dd5ef4bd92bdc0ce487078139d7d491c2e3ba4ec4b1ad8e5a0879: Status 404 returned error can't find the container with id 4903280ba25dd5ef4bd92bdc0ce487078139d7d491c2e3ba4ec4b1ad8e5a0879
Jan 30 06:45:14 crc kubenswrapper[4841]: I0130 06:45:14.928020 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 06:45:15 crc kubenswrapper[4841]: I0130 06:45:15.690482 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ae12c4-8995-4bcd-9237-393737f5ae1d","Type":"ContainerStarted","Data":"6810c4b46d7c14edfd74b2e7e63cd0261a2aef031a9ffb31d96110c40dbaced6"}
Jan 30 06:45:15 crc kubenswrapper[4841]: I0130 06:45:15.690782 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ae12c4-8995-4bcd-9237-393737f5ae1d","Type":"ContainerStarted","Data":"9c638a48e262a1245ed9f9b03f7bd44901522c54630c3d65d6038c05d3089a7d"}
Jan 30 06:45:15 crc kubenswrapper[4841]: I0130 06:45:15.690794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12ae12c4-8995-4bcd-9237-393737f5ae1d","Type":"ContainerStarted","Data":"4903280ba25dd5ef4bd92bdc0ce487078139d7d491c2e3ba4ec4b1ad8e5a0879"}
Jan 30 06:45:15 crc kubenswrapper[4841]: I0130 06:45:15.718844 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.718816281 podStartE2EDuration="1.718816281s" podCreationTimestamp="2026-01-30 06:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:15.708720482 +0000 UTC m=+5852.702193150" watchObservedRunningTime="2026-01-30 06:45:15.718816281 +0000 UTC m=+5852.712288959"
Jan 30 06:45:18 crc kubenswrapper[4841]: I0130 06:45:18.431904 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"
Jan 30 06:45:18 crc kubenswrapper[4841]: E0130 06:45:18.432500 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:45:19 crc kubenswrapper[4841]: I0130 06:45:19.408483 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 06:45:19 crc kubenswrapper[4841]: I0130 06:45:19.408851 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 06:45:22 crc kubenswrapper[4841]: I0130 06:45:22.761570 4841 generic.go:334] "Generic (PLEG): container finished" podID="d3447f0c-f61f-456b-8b1f-4956055d9bcc" containerID="f6e4fd5fcc96b19e980a941577cdf4a3eef3cc0b6c47033b2d6798ddfc7cdd8b" exitCode=137
Jan 30 06:45:22 crc kubenswrapper[4841]: I0130 06:45:22.761647 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3447f0c-f61f-456b-8b1f-4956055d9bcc","Type":"ContainerDied","Data":"f6e4fd5fcc96b19e980a941577cdf4a3eef3cc0b6c47033b2d6798ddfc7cdd8b"}
Jan 30 06:45:22 crc kubenswrapper[4841]: I0130 06:45:22.965315 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:45:22 crc kubenswrapper[4841]: I0130 06:45:22.995762 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-config-data\") pod \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") "
Jan 30 06:45:22 crc kubenswrapper[4841]: I0130 06:45:22.995822 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26m6w\" (UniqueName: \"kubernetes.io/projected/d3447f0c-f61f-456b-8b1f-4956055d9bcc-kube-api-access-26m6w\") pod \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") "
Jan 30 06:45:22 crc kubenswrapper[4841]: I0130 06:45:22.995905 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-combined-ca-bundle\") pod \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\" (UID: \"d3447f0c-f61f-456b-8b1f-4956055d9bcc\") "
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.029371 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3447f0c-f61f-456b-8b1f-4956055d9bcc-kube-api-access-26m6w" (OuterVolumeSpecName: "kube-api-access-26m6w") pod "d3447f0c-f61f-456b-8b1f-4956055d9bcc" (UID: "d3447f0c-f61f-456b-8b1f-4956055d9bcc"). InnerVolumeSpecName "kube-api-access-26m6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.030734 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-config-data" (OuterVolumeSpecName: "config-data") pod "d3447f0c-f61f-456b-8b1f-4956055d9bcc" (UID: "d3447f0c-f61f-456b-8b1f-4956055d9bcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.034217 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3447f0c-f61f-456b-8b1f-4956055d9bcc" (UID: "d3447f0c-f61f-456b-8b1f-4956055d9bcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.097087 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.097323 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26m6w\" (UniqueName: \"kubernetes.io/projected/d3447f0c-f61f-456b-8b1f-4956055d9bcc-kube-api-access-26m6w\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.097478 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3447f0c-f61f-456b-8b1f-4956055d9bcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.672290 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.773483 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3447f0c-f61f-456b-8b1f-4956055d9bcc","Type":"ContainerDied","Data":"7f204eb4255dcfaeccca4de9126afb3dbd28df87d90932e98b2aefc31dba3f4f"}
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.773533 4841 scope.go:117] "RemoveContainer" containerID="f6e4fd5fcc96b19e980a941577cdf4a3eef3cc0b6c47033b2d6798ddfc7cdd8b"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.773580 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.775182 4841 generic.go:334] "Generic (PLEG): container finished" podID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerID="75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51" exitCode=0
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.775226 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4bb63f-dd64-497c-96a9-720d93b6812b","Type":"ContainerDied","Data":"75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51"}
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.775251 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.775255 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4bb63f-dd64-497c-96a9-720d93b6812b","Type":"ContainerDied","Data":"643739b7e3c42832bf39ee335849b7dbf72a12574421c7d3607531dd35a1b39b"}
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.801369 4841 scope.go:117] "RemoveContainer" containerID="75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.811199 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4bb63f-dd64-497c-96a9-720d93b6812b-logs\") pod \"0a4bb63f-dd64-497c-96a9-720d93b6812b\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") "
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.811417 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-combined-ca-bundle\") pod \"0a4bb63f-dd64-497c-96a9-720d93b6812b\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") "
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.811447 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-config-data\") pod \"0a4bb63f-dd64-497c-96a9-720d93b6812b\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") "
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.811654 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhdcb\" (UniqueName: \"kubernetes.io/projected/0a4bb63f-dd64-497c-96a9-720d93b6812b-kube-api-access-jhdcb\") pod \"0a4bb63f-dd64-497c-96a9-720d93b6812b\" (UID: \"0a4bb63f-dd64-497c-96a9-720d93b6812b\") "
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.811918 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a4bb63f-dd64-497c-96a9-720d93b6812b-logs" (OuterVolumeSpecName: "logs") pod "0a4bb63f-dd64-497c-96a9-720d93b6812b" (UID: "0a4bb63f-dd64-497c-96a9-720d93b6812b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.813798 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4bb63f-dd64-497c-96a9-720d93b6812b-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.816635 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4bb63f-dd64-497c-96a9-720d93b6812b-kube-api-access-jhdcb" (OuterVolumeSpecName: "kube-api-access-jhdcb") pod "0a4bb63f-dd64-497c-96a9-720d93b6812b" (UID: "0a4bb63f-dd64-497c-96a9-720d93b6812b"). InnerVolumeSpecName "kube-api-access-jhdcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.817504 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.825572 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.842759 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:45:23 crc kubenswrapper[4841]: E0130 06:45:23.843144 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-api"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.843165 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-api"
Jan 30 06:45:23 crc kubenswrapper[4841]: E0130 06:45:23.843200 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3447f0c-f61f-456b-8b1f-4956055d9bcc" containerName="nova-scheduler-scheduler"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.843209 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3447f0c-f61f-456b-8b1f-4956055d9bcc" containerName="nova-scheduler-scheduler"
Jan 30 06:45:23 crc kubenswrapper[4841]: E0130 06:45:23.843232 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-log"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.843242 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-log"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.843472 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-api"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.843495 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3447f0c-f61f-456b-8b1f-4956055d9bcc" containerName="nova-scheduler-scheduler"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.843512 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" containerName="nova-api-log"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.844211 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.849039 4841 scope.go:117] "RemoveContainer" containerID="d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.849530 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.857624 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a4bb63f-dd64-497c-96a9-720d93b6812b" (UID: "0a4bb63f-dd64-497c-96a9-720d93b6812b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.872356 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.897863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-config-data" (OuterVolumeSpecName: "config-data") pod "0a4bb63f-dd64-497c-96a9-720d93b6812b" (UID: "0a4bb63f-dd64-497c-96a9-720d93b6812b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.915542 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.915578 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4bb63f-dd64-497c-96a9-720d93b6812b-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.915591 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhdcb\" (UniqueName: \"kubernetes.io/projected/0a4bb63f-dd64-497c-96a9-720d93b6812b-kube-api-access-jhdcb\") on node \"crc\" DevicePath \"\""
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.950929 4841 scope.go:117] "RemoveContainer" containerID="75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51"
Jan 30 06:45:23 crc kubenswrapper[4841]: E0130 06:45:23.951599 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51\": container with ID starting with 75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51 not found: ID does not exist" containerID="75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.951678 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51"} err="failed to get container status \"75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51\": rpc error: code = NotFound desc = could not find container \"75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51\": container with ID starting with 75ffcb4dcd92a588d25dcf2a694c687f61d80e04bdeef148057157576e287f51 not found: ID does not exist"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.951743 4841 scope.go:117] "RemoveContainer" containerID="d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909"
Jan 30 06:45:23 crc kubenswrapper[4841]: E0130 06:45:23.952360 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909\": container with ID starting with d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909 not found: ID does not exist" containerID="d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909"
Jan 30 06:45:23 crc kubenswrapper[4841]: I0130 06:45:23.952406 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909"} err="failed to get container status \"d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909\": rpc error: code = NotFound desc = could not find container \"d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909\": container with ID starting with d957136bae4ddac5faca15e0f5c6b4962f079cbe0070bc4cb9823518ca97b909 not found: ID does not exist"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.017559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcq9z\" (UniqueName: \"kubernetes.io/projected/e97b62be-59aa-4331-bc11-98b4fd2297d8-kube-api-access-tcq9z\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.017638 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97b62be-59aa-4331-bc11-98b4fd2297d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.017813 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97b62be-59aa-4331-bc11-98b4fd2297d8-config-data\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.119437 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97b62be-59aa-4331-bc11-98b4fd2297d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.119644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97b62be-59aa-4331-bc11-98b4fd2297d8-config-data\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.119806 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcq9z\" (UniqueName: \"kubernetes.io/projected/e97b62be-59aa-4331-bc11-98b4fd2297d8-kube-api-access-tcq9z\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.123464 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.137536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97b62be-59aa-4331-bc11-98b4fd2297d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.137978 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97b62be-59aa-4331-bc11-98b4fd2297d8-config-data\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.156260 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.163643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcq9z\" (UniqueName: \"kubernetes.io/projected/e97b62be-59aa-4331-bc11-98b4fd2297d8-kube-api-access-tcq9z\") pod \"nova-scheduler-0\" (UID: \"e97b62be-59aa-4331-bc11-98b4fd2297d8\") " pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.176820 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.178849 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.187078 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.193055 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.258315 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.324552 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48121b9a-91a0-40b8-ad3a-f598433ebd0f-logs\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.325044 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjt6s\" (UniqueName: \"kubernetes.io/projected/48121b9a-91a0-40b8-ad3a-f598433ebd0f-kube-api-access-pjt6s\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.325217 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.325277 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-config-data\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.412567 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.413313 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.426000 4841 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-pjt6s\" (UniqueName: \"kubernetes.io/projected/48121b9a-91a0-40b8-ad3a-f598433ebd0f-kube-api-access-pjt6s\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.426051 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.426076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-config-data\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.426156 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48121b9a-91a0-40b8-ad3a-f598433ebd0f-logs\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.427157 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48121b9a-91a0-40b8-ad3a-f598433ebd0f-logs\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.436494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-config-data\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc 
kubenswrapper[4841]: I0130 06:45:24.443921 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.471178 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a4bb63f-dd64-497c-96a9-720d93b6812b" path="/var/lib/kubelet/pods/0a4bb63f-dd64-497c-96a9-720d93b6812b/volumes" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.472696 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3447f0c-f61f-456b-8b1f-4956055d9bcc" path="/var/lib/kubelet/pods/d3447f0c-f61f-456b-8b1f-4956055d9bcc/volumes" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.477508 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjt6s\" (UniqueName: \"kubernetes.io/projected/48121b9a-91a0-40b8-ad3a-f598433ebd0f-kube-api-access-pjt6s\") pod \"nova-api-0\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.517875 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.771413 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 06:45:24 crc kubenswrapper[4841]: I0130 06:45:24.803361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e97b62be-59aa-4331-bc11-98b4fd2297d8","Type":"ContainerStarted","Data":"dab25fe40b2c4aec07e1afbdad4151d39796bbbd4001ae13c0bcbb0ddcf58b52"} Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.016001 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:25 crc kubenswrapper[4841]: W0130 06:45:25.020537 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48121b9a_91a0_40b8_ad3a_f598433ebd0f.slice/crio-4ae5ed4c23c3a31aaadfc2156167482891a21a6262355c60f475de1ef3062f8b WatchSource:0}: Error finding container 4ae5ed4c23c3a31aaadfc2156167482891a21a6262355c60f475de1ef3062f8b: Status 404 returned error can't find the container with id 4ae5ed4c23c3a31aaadfc2156167482891a21a6262355c60f475de1ef3062f8b Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.435587 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12ae12c4-8995-4bcd-9237-393737f5ae1d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.436152 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12ae12c4-8995-4bcd-9237-393737f5ae1d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 
06:45:25.818169 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48121b9a-91a0-40b8-ad3a-f598433ebd0f","Type":"ContainerStarted","Data":"824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35"} Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.818459 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48121b9a-91a0-40b8-ad3a-f598433ebd0f","Type":"ContainerStarted","Data":"a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd"} Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.818472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48121b9a-91a0-40b8-ad3a-f598433ebd0f","Type":"ContainerStarted","Data":"4ae5ed4c23c3a31aaadfc2156167482891a21a6262355c60f475de1ef3062f8b"} Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.820680 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e97b62be-59aa-4331-bc11-98b4fd2297d8","Type":"ContainerStarted","Data":"ce611459fc6084837ee821f812b641893e741d1d9e78b6d081bae3d9055c4c74"} Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.846942 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.846919478 podStartE2EDuration="1.846919478s" podCreationTimestamp="2026-01-30 06:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:25.840180249 +0000 UTC m=+5862.833652917" watchObservedRunningTime="2026-01-30 06:45:25.846919478 +0000 UTC m=+5862.840392126" Jan 30 06:45:25 crc kubenswrapper[4841]: I0130 06:45:25.862878 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.862850354 podStartE2EDuration="2.862850354s" podCreationTimestamp="2026-01-30 06:45:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:25.861721704 +0000 UTC m=+5862.855194342" watchObservedRunningTime="2026-01-30 06:45:25.862850354 +0000 UTC m=+5862.856323022" Jan 30 06:45:29 crc kubenswrapper[4841]: I0130 06:45:29.259309 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 06:45:30 crc kubenswrapper[4841]: I0130 06:45:30.436006 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:45:30 crc kubenswrapper[4841]: E0130 06:45:30.437062 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:45:34 crc kubenswrapper[4841]: I0130 06:45:34.259232 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 06:45:34 crc kubenswrapper[4841]: I0130 06:45:34.327792 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 06:45:34 crc kubenswrapper[4841]: I0130 06:45:34.414957 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 06:45:34 crc kubenswrapper[4841]: I0130 06:45:34.416235 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 06:45:34 crc kubenswrapper[4841]: I0130 06:45:34.423064 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 06:45:34 crc kubenswrapper[4841]: 
I0130 06:45:34.518888 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:45:34 crc kubenswrapper[4841]: I0130 06:45:34.519015 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:45:34 crc kubenswrapper[4841]: I0130 06:45:34.952752 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 06:45:35 crc kubenswrapper[4841]: I0130 06:45:35.014729 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 06:45:35 crc kubenswrapper[4841]: I0130 06:45:35.601637 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:35 crc kubenswrapper[4841]: I0130 06:45:35.601943 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:45:43 crc kubenswrapper[4841]: I0130 06:45:43.432753 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:45:43 crc kubenswrapper[4841]: E0130 06:45:43.435358 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" 
podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:45:44 crc kubenswrapper[4841]: I0130 06:45:44.523354 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 06:45:44 crc kubenswrapper[4841]: I0130 06:45:44.524105 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 06:45:44 crc kubenswrapper[4841]: I0130 06:45:44.527757 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 06:45:44 crc kubenswrapper[4841]: I0130 06:45:44.527889 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.071631 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.077086 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.278565 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6446589fcf-vl8h4"] Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.282679 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.330849 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6446589fcf-vl8h4"] Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.413610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-dns-svc\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.413850 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-ovsdbserver-nb\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.413965 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg74q\" (UniqueName: \"kubernetes.io/projected/1aef25db-fefb-45b2-8db8-14dd86910ddf-kube-api-access-mg74q\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.414077 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-config\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.414219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-ovsdbserver-sb\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.515793 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-dns-svc\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.515876 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-ovsdbserver-nb\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.515942 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg74q\" (UniqueName: \"kubernetes.io/projected/1aef25db-fefb-45b2-8db8-14dd86910ddf-kube-api-access-mg74q\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.516040 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-config\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.516077 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-ovsdbserver-sb\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.516612 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-dns-svc\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.516878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-ovsdbserver-nb\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.516928 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-ovsdbserver-sb\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.516939 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aef25db-fefb-45b2-8db8-14dd86910ddf-config\") pod \"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.540161 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg74q\" (UniqueName: \"kubernetes.io/projected/1aef25db-fefb-45b2-8db8-14dd86910ddf-kube-api-access-mg74q\") pod 
\"dnsmasq-dns-6446589fcf-vl8h4\" (UID: \"1aef25db-fefb-45b2-8db8-14dd86910ddf\") " pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:45 crc kubenswrapper[4841]: I0130 06:45:45.635281 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:46 crc kubenswrapper[4841]: W0130 06:45:46.235247 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aef25db_fefb_45b2_8db8_14dd86910ddf.slice/crio-4044076fc083d2d5c54007c11b6d0c4ab1d03479d241d6751fd774d794bf94e0 WatchSource:0}: Error finding container 4044076fc083d2d5c54007c11b6d0c4ab1d03479d241d6751fd774d794bf94e0: Status 404 returned error can't find the container with id 4044076fc083d2d5c54007c11b6d0c4ab1d03479d241d6751fd774d794bf94e0 Jan 30 06:45:46 crc kubenswrapper[4841]: I0130 06:45:46.237921 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6446589fcf-vl8h4"] Jan 30 06:45:47 crc kubenswrapper[4841]: I0130 06:45:47.099771 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" event={"ID":"1aef25db-fefb-45b2-8db8-14dd86910ddf","Type":"ContainerDied","Data":"9a9e73471563eb36e4092619d48ff8ebc48de68a5310e6e63cdbeb1770b22d1e"} Jan 30 06:45:47 crc kubenswrapper[4841]: I0130 06:45:47.097785 4841 generic.go:334] "Generic (PLEG): container finished" podID="1aef25db-fefb-45b2-8db8-14dd86910ddf" containerID="9a9e73471563eb36e4092619d48ff8ebc48de68a5310e6e63cdbeb1770b22d1e" exitCode=0 Jan 30 06:45:47 crc kubenswrapper[4841]: I0130 06:45:47.101631 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" event={"ID":"1aef25db-fefb-45b2-8db8-14dd86910ddf","Type":"ContainerStarted","Data":"4044076fc083d2d5c54007c11b6d0c4ab1d03479d241d6751fd774d794bf94e0"} Jan 30 06:45:48 crc kubenswrapper[4841]: I0130 06:45:48.118916 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" event={"ID":"1aef25db-fefb-45b2-8db8-14dd86910ddf","Type":"ContainerStarted","Data":"3086a259810c0e2554b8654e74b4be295a7af1e36f48985ec90349f726d33194"} Jan 30 06:45:48 crc kubenswrapper[4841]: I0130 06:45:48.119272 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:48 crc kubenswrapper[4841]: I0130 06:45:48.139184 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:48 crc kubenswrapper[4841]: I0130 06:45:48.139458 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-log" containerID="cri-o://a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd" gracePeriod=30 Jan 30 06:45:48 crc kubenswrapper[4841]: I0130 06:45:48.139646 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-api" containerID="cri-o://824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35" gracePeriod=30 Jan 30 06:45:49 crc kubenswrapper[4841]: I0130 06:45:49.128726 4841 generic.go:334] "Generic (PLEG): container finished" podID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerID="a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd" exitCode=143 Jan 30 06:45:49 crc kubenswrapper[4841]: I0130 06:45:49.130160 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48121b9a-91a0-40b8-ad3a-f598433ebd0f","Type":"ContainerDied","Data":"a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd"} Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.824465 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.869065 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" podStartSLOduration=6.869039059 podStartE2EDuration="6.869039059s" podCreationTimestamp="2026-01-30 06:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:48.16047752 +0000 UTC m=+5885.153950198" watchObservedRunningTime="2026-01-30 06:45:51.869039059 +0000 UTC m=+5888.862511727" Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.975021 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-config-data\") pod \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.975373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48121b9a-91a0-40b8-ad3a-f598433ebd0f-logs\") pod \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.975441 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-combined-ca-bundle\") pod \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\" (UID: \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.975655 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjt6s\" (UniqueName: \"kubernetes.io/projected/48121b9a-91a0-40b8-ad3a-f598433ebd0f-kube-api-access-pjt6s\") pod \"48121b9a-91a0-40b8-ad3a-f598433ebd0f\" (UID: 
\"48121b9a-91a0-40b8-ad3a-f598433ebd0f\") " Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.976044 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48121b9a-91a0-40b8-ad3a-f598433ebd0f-logs" (OuterVolumeSpecName: "logs") pod "48121b9a-91a0-40b8-ad3a-f598433ebd0f" (UID: "48121b9a-91a0-40b8-ad3a-f598433ebd0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:45:51 crc kubenswrapper[4841]: I0130 06:45:51.976281 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48121b9a-91a0-40b8-ad3a-f598433ebd0f-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.001246 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48121b9a-91a0-40b8-ad3a-f598433ebd0f-kube-api-access-pjt6s" (OuterVolumeSpecName: "kube-api-access-pjt6s") pod "48121b9a-91a0-40b8-ad3a-f598433ebd0f" (UID: "48121b9a-91a0-40b8-ad3a-f598433ebd0f"). InnerVolumeSpecName "kube-api-access-pjt6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.026662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-config-data" (OuterVolumeSpecName: "config-data") pod "48121b9a-91a0-40b8-ad3a-f598433ebd0f" (UID: "48121b9a-91a0-40b8-ad3a-f598433ebd0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.027631 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48121b9a-91a0-40b8-ad3a-f598433ebd0f" (UID: "48121b9a-91a0-40b8-ad3a-f598433ebd0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.078001 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.078051 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjt6s\" (UniqueName: \"kubernetes.io/projected/48121b9a-91a0-40b8-ad3a-f598433ebd0f-kube-api-access-pjt6s\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.078065 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48121b9a-91a0-40b8-ad3a-f598433ebd0f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.164073 4841 generic.go:334] "Generic (PLEG): container finished" podID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerID="824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35" exitCode=0 Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.164123 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48121b9a-91a0-40b8-ad3a-f598433ebd0f","Type":"ContainerDied","Data":"824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35"} Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.164149 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.164184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48121b9a-91a0-40b8-ad3a-f598433ebd0f","Type":"ContainerDied","Data":"4ae5ed4c23c3a31aaadfc2156167482891a21a6262355c60f475de1ef3062f8b"} Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.164209 4841 scope.go:117] "RemoveContainer" containerID="824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.184710 4841 scope.go:117] "RemoveContainer" containerID="a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.207284 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.210768 4841 scope.go:117] "RemoveContainer" containerID="2181758f83931ab95d5f734fd20cb598f51d12c4708e8c7e675b734e95a1e4bf" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.222043 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.227746 4841 scope.go:117] "RemoveContainer" containerID="824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35" Jan 30 06:45:52 crc kubenswrapper[4841]: E0130 06:45:52.228108 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35\": container with ID starting with 824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35 not found: ID does not exist" containerID="824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.228143 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35"} err="failed to get container status \"824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35\": rpc error: code = NotFound desc = could not find container \"824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35\": container with ID starting with 824b4f81c615a7bca6d8c5aea6710fc3b94a02426c5a2d351627a7a23c090e35 not found: ID does not exist" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.228172 4841 scope.go:117] "RemoveContainer" containerID="a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.228251 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:52 crc kubenswrapper[4841]: E0130 06:45:52.228572 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd\": container with ID starting with a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd not found: ID does not exist" containerID="a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.228629 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd"} err="failed to get container status \"a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd\": rpc error: code = NotFound desc = could not find container \"a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd\": container with ID starting with a45933f2b01110fc6e850d2f3efd82ca2c1603f361b1dd47851c1da9dc35f5cd not found: ID does not exist" Jan 30 06:45:52 crc kubenswrapper[4841]: E0130 06:45:52.228694 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-api" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.228708 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-api" Jan 30 06:45:52 crc kubenswrapper[4841]: E0130 06:45:52.228738 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-log" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.228746 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-log" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.228980 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-api" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.229013 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" containerName="nova-api-log" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.230150 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.235481 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.235767 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.235802 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.250631 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.385286 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-config-data\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.385382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec0a9da-3385-4362-b5bf-c44ff1de727a-logs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.385917 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt95g\" (UniqueName: \"kubernetes.io/projected/fec0a9da-3385-4362-b5bf-c44ff1de727a-kube-api-access-dt95g\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.386036 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.386171 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.386300 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-public-tls-certs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.456634 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48121b9a-91a0-40b8-ad3a-f598433ebd0f" path="/var/lib/kubelet/pods/48121b9a-91a0-40b8-ad3a-f598433ebd0f/volumes" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.487921 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec0a9da-3385-4362-b5bf-c44ff1de727a-logs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.488023 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt95g\" (UniqueName: \"kubernetes.io/projected/fec0a9da-3385-4362-b5bf-c44ff1de727a-kube-api-access-dt95g\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.488059 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.488092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.488131 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-public-tls-certs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.488166 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-config-data\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.489639 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec0a9da-3385-4362-b5bf-c44ff1de727a-logs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.494357 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-config-data\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 
crc kubenswrapper[4841]: I0130 06:45:52.499540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.503847 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-public-tls-certs\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.504503 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec0a9da-3385-4362-b5bf-c44ff1de727a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.509791 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt95g\" (UniqueName: \"kubernetes.io/projected/fec0a9da-3385-4362-b5bf-c44ff1de727a-kube-api-access-dt95g\") pod \"nova-api-0\" (UID: \"fec0a9da-3385-4362-b5bf-c44ff1de727a\") " pod="openstack/nova-api-0" Jan 30 06:45:52 crc kubenswrapper[4841]: I0130 06:45:52.551998 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 06:45:53 crc kubenswrapper[4841]: I0130 06:45:53.081755 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 06:45:53 crc kubenswrapper[4841]: I0130 06:45:53.179365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fec0a9da-3385-4362-b5bf-c44ff1de727a","Type":"ContainerStarted","Data":"78309a8cc24421c751cd02523b1b4a520364242d1f7c60298affa9501e2a60ac"} Jan 30 06:45:54 crc kubenswrapper[4841]: I0130 06:45:54.191004 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fec0a9da-3385-4362-b5bf-c44ff1de727a","Type":"ContainerStarted","Data":"9734cfbbe2aaf4aa6de2c63ebce6157d17fe63b64e98eb4c0963678cca1b63de"} Jan 30 06:45:54 crc kubenswrapper[4841]: I0130 06:45:54.191390 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fec0a9da-3385-4362-b5bf-c44ff1de727a","Type":"ContainerStarted","Data":"ec58dab0ccf5cfe92bd8d8b7589da3c965d8fbd2c6cd97abf7d798c0a8643587"} Jan 30 06:45:54 crc kubenswrapper[4841]: I0130 06:45:54.220879 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.220861589 podStartE2EDuration="2.220861589s" podCreationTimestamp="2026-01-30 06:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:45:54.220328465 +0000 UTC m=+5891.213801133" watchObservedRunningTime="2026-01-30 06:45:54.220861589 +0000 UTC m=+5891.214334227" Jan 30 06:45:55 crc kubenswrapper[4841]: I0130 06:45:55.637659 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6446589fcf-vl8h4" Jan 30 06:45:55 crc kubenswrapper[4841]: I0130 06:45:55.736549 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fb5f469f9-lc6zv"] Jan 30 
06:45:55 crc kubenswrapper[4841]: I0130 06:45:55.736842 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" podUID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerName="dnsmasq-dns" containerID="cri-o://8238e9d7576ea53b78b4060063ecd593cd36dfb4b2b76876395e3f04eff8d08b" gracePeriod=10 Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.222233 4841 generic.go:334] "Generic (PLEG): container finished" podID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerID="8238e9d7576ea53b78b4060063ecd593cd36dfb4b2b76876395e3f04eff8d08b" exitCode=0 Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.222545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" event={"ID":"02ad2afe-9c7b-437c-a8dd-21902fed0051","Type":"ContainerDied","Data":"8238e9d7576ea53b78b4060063ecd593cd36dfb4b2b76876395e3f04eff8d08b"} Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.222574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" event={"ID":"02ad2afe-9c7b-437c-a8dd-21902fed0051","Type":"ContainerDied","Data":"bbeef325619650011786125cf83bd356393bac6f5aeb8782ebf7c3826231c099"} Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.222587 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbeef325619650011786125cf83bd356393bac6f5aeb8782ebf7c3826231c099" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.234655 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.387670 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n86c2\" (UniqueName: \"kubernetes.io/projected/02ad2afe-9c7b-437c-a8dd-21902fed0051-kube-api-access-n86c2\") pod \"02ad2afe-9c7b-437c-a8dd-21902fed0051\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.387953 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-nb\") pod \"02ad2afe-9c7b-437c-a8dd-21902fed0051\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.388050 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-config\") pod \"02ad2afe-9c7b-437c-a8dd-21902fed0051\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.388132 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-sb\") pod \"02ad2afe-9c7b-437c-a8dd-21902fed0051\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.388378 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-dns-svc\") pod \"02ad2afe-9c7b-437c-a8dd-21902fed0051\" (UID: \"02ad2afe-9c7b-437c-a8dd-21902fed0051\") " Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.418589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/02ad2afe-9c7b-437c-a8dd-21902fed0051-kube-api-access-n86c2" (OuterVolumeSpecName: "kube-api-access-n86c2") pod "02ad2afe-9c7b-437c-a8dd-21902fed0051" (UID: "02ad2afe-9c7b-437c-a8dd-21902fed0051"). InnerVolumeSpecName "kube-api-access-n86c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.491144 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-config" (OuterVolumeSpecName: "config") pod "02ad2afe-9c7b-437c-a8dd-21902fed0051" (UID: "02ad2afe-9c7b-437c-a8dd-21902fed0051"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.492678 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n86c2\" (UniqueName: \"kubernetes.io/projected/02ad2afe-9c7b-437c-a8dd-21902fed0051-kube-api-access-n86c2\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.492692 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.512093 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02ad2afe-9c7b-437c-a8dd-21902fed0051" (UID: "02ad2afe-9c7b-437c-a8dd-21902fed0051"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.514589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02ad2afe-9c7b-437c-a8dd-21902fed0051" (UID: "02ad2afe-9c7b-437c-a8dd-21902fed0051"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.518287 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02ad2afe-9c7b-437c-a8dd-21902fed0051" (UID: "02ad2afe-9c7b-437c-a8dd-21902fed0051"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.594505 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.594544 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:56 crc kubenswrapper[4841]: I0130 06:45:56.594557 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ad2afe-9c7b-437c-a8dd-21902fed0051-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 06:45:57 crc kubenswrapper[4841]: I0130 06:45:57.232964 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fb5f469f9-lc6zv" Jan 30 06:45:57 crc kubenswrapper[4841]: I0130 06:45:57.271676 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fb5f469f9-lc6zv"] Jan 30 06:45:57 crc kubenswrapper[4841]: I0130 06:45:57.280621 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fb5f469f9-lc6zv"] Jan 30 06:45:57 crc kubenswrapper[4841]: I0130 06:45:57.432083 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:45:57 crc kubenswrapper[4841]: E0130 06:45:57.432413 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:45:58 crc kubenswrapper[4841]: I0130 06:45:58.441483 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ad2afe-9c7b-437c-a8dd-21902fed0051" path="/var/lib/kubelet/pods/02ad2afe-9c7b-437c-a8dd-21902fed0051/volumes" Jan 30 06:46:02 crc kubenswrapper[4841]: I0130 06:46:02.553888 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:46:02 crc kubenswrapper[4841]: I0130 06:46:02.557219 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 06:46:03 crc kubenswrapper[4841]: I0130 06:46:03.565606 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fec0a9da-3385-4362-b5bf-c44ff1de727a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.99:8774/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Jan 30 06:46:03 crc kubenswrapper[4841]: I0130 06:46:03.565632 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fec0a9da-3385-4362-b5bf-c44ff1de727a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.99:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 06:46:09 crc kubenswrapper[4841]: I0130 06:46:09.432672 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:46:09 crc kubenswrapper[4841]: E0130 06:46:09.433724 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:46:12 crc kubenswrapper[4841]: I0130 06:46:12.562879 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 06:46:12 crc kubenswrapper[4841]: I0130 06:46:12.564781 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 06:46:12 crc kubenswrapper[4841]: I0130 06:46:12.570232 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 06:46:12 crc kubenswrapper[4841]: I0130 06:46:12.572489 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 06:46:13 crc kubenswrapper[4841]: I0130 06:46:13.418826 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 06:46:13 crc kubenswrapper[4841]: I0130 06:46:13.426979 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Jan 30 06:46:21 crc kubenswrapper[4841]: I0130 06:46:21.432296 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:46:21 crc kubenswrapper[4841]: E0130 06:46:21.433442 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:46:22 crc kubenswrapper[4841]: I0130 06:46:22.058462 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-caf3-account-create-update-rsk67"] Jan 30 06:46:22 crc kubenswrapper[4841]: I0130 06:46:22.072795 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zzzcl"] Jan 30 06:46:22 crc kubenswrapper[4841]: I0130 06:46:22.084391 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zzzcl"] Jan 30 06:46:22 crc kubenswrapper[4841]: I0130 06:46:22.095435 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-caf3-account-create-update-rsk67"] Jan 30 06:46:22 crc kubenswrapper[4841]: I0130 06:46:22.450679 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b447a96e-6def-4771-b1f9-ed2f17a2fd37" path="/var/lib/kubelet/pods/b447a96e-6def-4771-b1f9-ed2f17a2fd37/volumes" Jan 30 06:46:22 crc kubenswrapper[4841]: I0130 06:46:22.451215 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deed29e2-f797-4d6e-a53c-8a6a70f98888" path="/var/lib/kubelet/pods/deed29e2-f797-4d6e-a53c-8a6a70f98888/volumes" Jan 30 06:46:29 crc kubenswrapper[4841]: I0130 06:46:29.027202 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-sync-bltvn"] Jan 30 06:46:29 crc kubenswrapper[4841]: I0130 06:46:29.035277 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-bltvn"] Jan 30 06:46:30 crc kubenswrapper[4841]: I0130 06:46:30.452022 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="670940fd-a625-4698-9f3c-12af46a73bf4" path="/var/lib/kubelet/pods/670940fd-a625-4698-9f3c-12af46a73bf4/volumes" Jan 30 06:46:36 crc kubenswrapper[4841]: I0130 06:46:36.432266 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:46:36 crc kubenswrapper[4841]: E0130 06:46:36.433120 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.858255 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4wrmv"] Jan 30 06:46:37 crc kubenswrapper[4841]: E0130 06:46:37.859198 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerName="init" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.859238 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerName="init" Jan 30 06:46:37 crc kubenswrapper[4841]: E0130 06:46:37.859252 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerName="dnsmasq-dns" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.859263 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerName="dnsmasq-dns" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.859648 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="02ad2afe-9c7b-437c-a8dd-21902fed0051" containerName="dnsmasq-dns" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.860888 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.866604 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2596f"] Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.868030 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.868297 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.868597 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nt2w9" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.869436 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.886495 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wrmv"] Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.903907 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2596f"] Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.987587 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmfd\" (UniqueName: \"kubernetes.io/projected/496ddc15-cb66-4871-85b5-39673abb40e6-kube-api-access-4zmfd\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " 
pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.987913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-scripts\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.988091 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-run-ovn\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.988226 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/496ddc15-cb66-4871-85b5-39673abb40e6-ovn-controller-tls-certs\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.988394 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-etc-ovs\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.988556 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-log\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 
06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.988680 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496ddc15-cb66-4871-85b5-39673abb40e6-combined-ca-bundle\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.988813 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-run\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.988971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-lib\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.989110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/496ddc15-cb66-4871-85b5-39673abb40e6-scripts\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.989267 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sxj4\" (UniqueName: \"kubernetes.io/projected/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-kube-api-access-2sxj4\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:37 crc kubenswrapper[4841]: 
I0130 06:46:37.989450 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-log-ovn\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:37 crc kubenswrapper[4841]: I0130 06:46:37.989579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-run\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091183 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-run\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-log-ovn\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091468 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmfd\" (UniqueName: \"kubernetes.io/projected/496ddc15-cb66-4871-85b5-39673abb40e6-kube-api-access-4zmfd\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091489 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-scripts\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091545 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-run-ovn\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091578 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/496ddc15-cb66-4871-85b5-39673abb40e6-ovn-controller-tls-certs\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091606 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-etc-ovs\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091628 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-log\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496ddc15-cb66-4871-85b5-39673abb40e6-combined-ca-bundle\") pod \"ovn-controller-4wrmv\" (UID: 
\"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091666 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-run\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-lib\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091726 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/496ddc15-cb66-4871-85b5-39673abb40e6-scripts\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sxj4\" (UniqueName: \"kubernetes.io/projected/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-kube-api-access-2sxj4\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-log\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091817 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-run-ovn\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091810 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-etc-ovs\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091863 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-lib\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-var-run\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.091935 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-log-ovn\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.092369 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/496ddc15-cb66-4871-85b5-39673abb40e6-var-run\") pod \"ovn-controller-4wrmv\" (UID: 
\"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.093784 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/496ddc15-cb66-4871-85b5-39673abb40e6-scripts\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.094302 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-scripts\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.097891 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496ddc15-cb66-4871-85b5-39673abb40e6-combined-ca-bundle\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.106930 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/496ddc15-cb66-4871-85b5-39673abb40e6-ovn-controller-tls-certs\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.121201 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmfd\" (UniqueName: \"kubernetes.io/projected/496ddc15-cb66-4871-85b5-39673abb40e6-kube-api-access-4zmfd\") pod \"ovn-controller-4wrmv\" (UID: \"496ddc15-cb66-4871-85b5-39673abb40e6\") " pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.134979 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sxj4\" (UniqueName: \"kubernetes.io/projected/7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd-kube-api-access-2sxj4\") pod \"ovn-controller-ovs-2596f\" (UID: \"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd\") " pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.191730 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.213457 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:38 crc kubenswrapper[4841]: I0130 06:46:38.807438 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wrmv"] Jan 30 06:46:38 crc kubenswrapper[4841]: W0130 06:46:38.809908 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod496ddc15_cb66_4871_85b5_39673abb40e6.slice/crio-d8593c07238cc94ea86d97e7612d8ac8d31baece2bd7328bb634b861e5d27f88 WatchSource:0}: Error finding container d8593c07238cc94ea86d97e7612d8ac8d31baece2bd7328bb634b861e5d27f88: Status 404 returned error can't find the container with id d8593c07238cc94ea86d97e7612d8ac8d31baece2bd7328bb634b861e5d27f88 Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.215091 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2596f"] Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.462679 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mm4vw"] Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.469725 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.473044 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.474102 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mm4vw"] Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.628480 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-xblv8"] Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.629533 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-config\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.629584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-ovn-rundir\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.629543 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.629643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-ovs-rundir\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.629687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.629719 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-combined-ca-bundle\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.629752 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4fph\" (UniqueName: \"kubernetes.io/projected/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-kube-api-access-t4fph\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.641188 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-xblv8"] Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.692300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-2596f" event={"ID":"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd","Type":"ContainerStarted","Data":"6d4685e9d100e6a56712d0f14d9b34d9888bbe9e10691cbb5b2afddf12286791"} Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.692342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2596f" event={"ID":"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd","Type":"ContainerStarted","Data":"b3a532b1983dbaed721a1693ff17c483a47536de541c8689c305924753c70bbe"} Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.694224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wrmv" event={"ID":"496ddc15-cb66-4871-85b5-39673abb40e6","Type":"ContainerStarted","Data":"6fb35b9095ab37e02b0aaffac3df7258312dfb3b7567a0683dba1499e496ad5e"} Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.694249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wrmv" event={"ID":"496ddc15-cb66-4871-85b5-39673abb40e6","Type":"ContainerStarted","Data":"d8593c07238cc94ea86d97e7612d8ac8d31baece2bd7328bb634b861e5d27f88"} Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.694427 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-4wrmv" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.731853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-ovs-rundir\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.731905 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mm4vw\" (UID: 
\"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.731950 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-combined-ca-bundle\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.731996 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4fph\" (UniqueName: \"kubernetes.io/projected/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-kube-api-access-t4fph\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.732059 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-operator-scripts\") pod \"octavia-db-create-xblv8\" (UID: \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.732146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-config\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.732172 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bqnq\" (UniqueName: \"kubernetes.io/projected/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-kube-api-access-9bqnq\") pod \"octavia-db-create-xblv8\" (UID: 
\"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.732198 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-ovn-rundir\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.732541 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-ovn-rundir\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.732654 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-ovs-rundir\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.735285 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-config\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.737667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc 
kubenswrapper[4841]: I0130 06:46:39.738905 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-combined-ca-bundle\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.739486 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4wrmv" podStartSLOduration=2.73938922 podStartE2EDuration="2.73938922s" podCreationTimestamp="2026-01-30 06:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:46:39.737822808 +0000 UTC m=+5936.731295446" watchObservedRunningTime="2026-01-30 06:46:39.73938922 +0000 UTC m=+5936.732861858" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.753038 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4fph\" (UniqueName: \"kubernetes.io/projected/6caea7ee-bd59-4d38-89d0-ca3bbeda764e-kube-api-access-t4fph\") pod \"ovn-controller-metrics-mm4vw\" (UID: \"6caea7ee-bd59-4d38-89d0-ca3bbeda764e\") " pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.796239 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mm4vw" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.833378 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-operator-scripts\") pod \"octavia-db-create-xblv8\" (UID: \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.833497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bqnq\" (UniqueName: \"kubernetes.io/projected/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-kube-api-access-9bqnq\") pod \"octavia-db-create-xblv8\" (UID: \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.834165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-operator-scripts\") pod \"octavia-db-create-xblv8\" (UID: \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.854278 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bqnq\" (UniqueName: \"kubernetes.io/projected/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-kube-api-access-9bqnq\") pod \"octavia-db-create-xblv8\" (UID: \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:39 crc kubenswrapper[4841]: I0130 06:46:39.944433 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.315956 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mm4vw"] Jan 30 06:46:40 crc kubenswrapper[4841]: W0130 06:46:40.325579 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6caea7ee_bd59_4d38_89d0_ca3bbeda764e.slice/crio-4ba58de5c77acc8443e752801508c59974364d32d0edda7707e4e8deb18ebe39 WatchSource:0}: Error finding container 4ba58de5c77acc8443e752801508c59974364d32d0edda7707e4e8deb18ebe39: Status 404 returned error can't find the container with id 4ba58de5c77acc8443e752801508c59974364d32d0edda7707e4e8deb18ebe39 Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.357466 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wbx4d"] Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.359777 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.375707 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbx4d"] Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.459884 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-xblv8"] Jan 30 06:46:40 crc kubenswrapper[4841]: W0130 06:46:40.472943 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c7b918d_ef6a_4644_bf1a_d653c4cfdeac.slice/crio-c453f401af29d483350e95e2635dbf1ade93612f820926d1d4907a20aff14202 WatchSource:0}: Error finding container c453f401af29d483350e95e2635dbf1ade93612f820926d1d4907a20aff14202: Status 404 returned error can't find the container with id c453f401af29d483350e95e2635dbf1ade93612f820926d1d4907a20aff14202 Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.550158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-catalog-content\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.550465 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrms\" (UniqueName: \"kubernetes.io/projected/79fa1885-942f-48ce-94f4-e1c807e0ae4e-kube-api-access-dlrms\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.550495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-utilities\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.651622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-catalog-content\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.651669 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrms\" (UniqueName: \"kubernetes.io/projected/79fa1885-942f-48ce-94f4-e1c807e0ae4e-kube-api-access-dlrms\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.651688 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-utilities\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.652217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-utilities\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.652420 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-catalog-content\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.668139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrms\" (UniqueName: \"kubernetes.io/projected/79fa1885-942f-48ce-94f4-e1c807e0ae4e-kube-api-access-dlrms\") pod \"redhat-operators-wbx4d\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.701958 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-a59b-account-create-update-xr8hp"] Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.703071 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.704444 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.704871 4841 generic.go:334] "Generic (PLEG): container finished" podID="7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd" containerID="6d4685e9d100e6a56712d0f14d9b34d9888bbe9e10691cbb5b2afddf12286791" exitCode=0 Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.705031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2596f" event={"ID":"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd","Type":"ContainerDied","Data":"6d4685e9d100e6a56712d0f14d9b34d9888bbe9e10691cbb5b2afddf12286791"} Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.707219 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mm4vw" 
event={"ID":"6caea7ee-bd59-4d38-89d0-ca3bbeda764e","Type":"ContainerStarted","Data":"6ef9de7063082996e95f3e0d633835941766bf5cc1238345ab03f9cdb53a2835"} Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.707254 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mm4vw" event={"ID":"6caea7ee-bd59-4d38-89d0-ca3bbeda764e","Type":"ContainerStarted","Data":"4ba58de5c77acc8443e752801508c59974364d32d0edda7707e4e8deb18ebe39"} Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.710584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-xblv8" event={"ID":"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac","Type":"ContainerStarted","Data":"a93748098483f741ad17575cfeaaf729dc2486ab84f610abf18b5ada2b6e3e10"} Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.710607 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-xblv8" event={"ID":"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac","Type":"ContainerStarted","Data":"c453f401af29d483350e95e2635dbf1ade93612f820926d1d4907a20aff14202"} Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.716511 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-a59b-account-create-update-xr8hp"] Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.778035 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.812931 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mm4vw" podStartSLOduration=1.812911974 podStartE2EDuration="1.812911974s" podCreationTimestamp="2026-01-30 06:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:46:40.773264557 +0000 UTC m=+5937.766737195" watchObservedRunningTime="2026-01-30 06:46:40.812911974 +0000 UTC m=+5937.806384612" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.839857 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-xblv8" podStartSLOduration=1.839826202 podStartE2EDuration="1.839826202s" podCreationTimestamp="2026-01-30 06:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:46:40.823765944 +0000 UTC m=+5937.817238582" watchObservedRunningTime="2026-01-30 06:46:40.839826202 +0000 UTC m=+5937.833298840" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.856104 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsr5t\" (UniqueName: \"kubernetes.io/projected/59452b25-eace-4fe3-985a-2efd23a31dc4-kube-api-access-zsr5t\") pod \"octavia-a59b-account-create-update-xr8hp\" (UID: \"59452b25-eace-4fe3-985a-2efd23a31dc4\") " pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.856297 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59452b25-eace-4fe3-985a-2efd23a31dc4-operator-scripts\") pod \"octavia-a59b-account-create-update-xr8hp\" (UID: 
\"59452b25-eace-4fe3-985a-2efd23a31dc4\") " pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.958266 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59452b25-eace-4fe3-985a-2efd23a31dc4-operator-scripts\") pod \"octavia-a59b-account-create-update-xr8hp\" (UID: \"59452b25-eace-4fe3-985a-2efd23a31dc4\") " pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.958352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsr5t\" (UniqueName: \"kubernetes.io/projected/59452b25-eace-4fe3-985a-2efd23a31dc4-kube-api-access-zsr5t\") pod \"octavia-a59b-account-create-update-xr8hp\" (UID: \"59452b25-eace-4fe3-985a-2efd23a31dc4\") " pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.959553 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59452b25-eace-4fe3-985a-2efd23a31dc4-operator-scripts\") pod \"octavia-a59b-account-create-update-xr8hp\" (UID: \"59452b25-eace-4fe3-985a-2efd23a31dc4\") " pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:40 crc kubenswrapper[4841]: I0130 06:46:40.981071 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsr5t\" (UniqueName: \"kubernetes.io/projected/59452b25-eace-4fe3-985a-2efd23a31dc4-kube-api-access-zsr5t\") pod \"octavia-a59b-account-create-update-xr8hp\" (UID: \"59452b25-eace-4fe3-985a-2efd23a31dc4\") " pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.082330 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.299778 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbx4d"] Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.702791 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-a59b-account-create-update-xr8hp"] Jan 30 06:46:41 crc kubenswrapper[4841]: W0130 06:46:41.708635 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59452b25_eace_4fe3_985a_2efd23a31dc4.slice/crio-947b5eaf35cc8f42b4b668d96ef25ea4b36102063cb38e5bdc0415438863fb71 WatchSource:0}: Error finding container 947b5eaf35cc8f42b4b668d96ef25ea4b36102063cb38e5bdc0415438863fb71: Status 404 returned error can't find the container with id 947b5eaf35cc8f42b4b668d96ef25ea4b36102063cb38e5bdc0415438863fb71 Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.722717 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2596f" event={"ID":"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd","Type":"ContainerStarted","Data":"d5cfaa3924b5ee1cd4238def9890ebd553a6e77efa9c4adf191dadc013b76a54"} Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.725556 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbx4d" event={"ID":"79fa1885-942f-48ce-94f4-e1c807e0ae4e","Type":"ContainerStarted","Data":"63bc894312fe6898f802d65001222053e36bb36267fdb714d68ad5da7fa6a24d"} Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.729292 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a59b-account-create-update-xr8hp" event={"ID":"59452b25-eace-4fe3-985a-2efd23a31dc4","Type":"ContainerStarted","Data":"947b5eaf35cc8f42b4b668d96ef25ea4b36102063cb38e5bdc0415438863fb71"} Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.731727 4841 
generic.go:334] "Generic (PLEG): container finished" podID="2c7b918d-ef6a-4644-bf1a-d653c4cfdeac" containerID="a93748098483f741ad17575cfeaaf729dc2486ab84f610abf18b5ada2b6e3e10" exitCode=0 Jan 30 06:46:41 crc kubenswrapper[4841]: I0130 06:46:41.731813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-xblv8" event={"ID":"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac","Type":"ContainerDied","Data":"a93748098483f741ad17575cfeaaf729dc2486ab84f610abf18b5ada2b6e3e10"} Jan 30 06:46:42 crc kubenswrapper[4841]: I0130 06:46:42.743258 4841 generic.go:334] "Generic (PLEG): container finished" podID="59452b25-eace-4fe3-985a-2efd23a31dc4" containerID="faba54ee2d6ecf901ab1a1a6af0a6c3f047379488ad4e20367c55ec816112a6d" exitCode=0 Jan 30 06:46:42 crc kubenswrapper[4841]: I0130 06:46:42.743339 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a59b-account-create-update-xr8hp" event={"ID":"59452b25-eace-4fe3-985a-2efd23a31dc4","Type":"ContainerDied","Data":"faba54ee2d6ecf901ab1a1a6af0a6c3f047379488ad4e20367c55ec816112a6d"} Jan 30 06:46:42 crc kubenswrapper[4841]: I0130 06:46:42.746588 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2596f" event={"ID":"7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd","Type":"ContainerStarted","Data":"842bd2fe3b4814eaa53beaa610b0fe38c0ff982f9f4f3f1737d0cbeb32ae40ef"} Jan 30 06:46:42 crc kubenswrapper[4841]: I0130 06:46:42.746751 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:42 crc kubenswrapper[4841]: I0130 06:46:42.748504 4841 generic.go:334] "Generic (PLEG): container finished" podID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerID="86d54b1f315e6e14acf06def77c133f99cf1d55aa71981ceadf21aebf38ec0f5" exitCode=0 Jan 30 06:46:42 crc kubenswrapper[4841]: I0130 06:46:42.748560 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbx4d" 
event={"ID":"79fa1885-942f-48ce-94f4-e1c807e0ae4e","Type":"ContainerDied","Data":"86d54b1f315e6e14acf06def77c133f99cf1d55aa71981ceadf21aebf38ec0f5"} Jan 30 06:46:42 crc kubenswrapper[4841]: I0130 06:46:42.779346 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2596f" podStartSLOduration=5.779331724 podStartE2EDuration="5.779331724s" podCreationTimestamp="2026-01-30 06:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:46:42.775846212 +0000 UTC m=+5939.769318870" watchObservedRunningTime="2026-01-30 06:46:42.779331724 +0000 UTC m=+5939.772804362" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.041356 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-trbn6"] Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.066092 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-trbn6"] Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.134075 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.212474 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-operator-scripts\") pod \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\" (UID: \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.212551 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bqnq\" (UniqueName: \"kubernetes.io/projected/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-kube-api-access-9bqnq\") pod \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\" (UID: \"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac\") " Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.213534 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c7b918d-ef6a-4644-bf1a-d653c4cfdeac" (UID: "2c7b918d-ef6a-4644-bf1a-d653c4cfdeac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.214325 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.217456 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-kube-api-access-9bqnq" (OuterVolumeSpecName: "kube-api-access-9bqnq") pod "2c7b918d-ef6a-4644-bf1a-d653c4cfdeac" (UID: "2c7b918d-ef6a-4644-bf1a-d653c4cfdeac"). InnerVolumeSpecName "kube-api-access-9bqnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.315290 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.315629 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bqnq\" (UniqueName: \"kubernetes.io/projected/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac-kube-api-access-9bqnq\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.762016 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbx4d" event={"ID":"79fa1885-942f-48ce-94f4-e1c807e0ae4e","Type":"ContainerStarted","Data":"6f16d134dfda198df4faa973050acd988c6ff6084ed57a09975e5e96d8f99c81"} Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.768622 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-xblv8" Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.773957 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-xblv8" event={"ID":"2c7b918d-ef6a-4644-bf1a-d653c4cfdeac","Type":"ContainerDied","Data":"c453f401af29d483350e95e2635dbf1ade93612f820926d1d4907a20aff14202"} Jan 30 06:46:43 crc kubenswrapper[4841]: I0130 06:46:43.774037 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c453f401af29d483350e95e2635dbf1ade93612f820926d1d4907a20aff14202" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.181618 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.230116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59452b25-eace-4fe3-985a-2efd23a31dc4-operator-scripts\") pod \"59452b25-eace-4fe3-985a-2efd23a31dc4\" (UID: \"59452b25-eace-4fe3-985a-2efd23a31dc4\") " Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.230165 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsr5t\" (UniqueName: \"kubernetes.io/projected/59452b25-eace-4fe3-985a-2efd23a31dc4-kube-api-access-zsr5t\") pod \"59452b25-eace-4fe3-985a-2efd23a31dc4\" (UID: \"59452b25-eace-4fe3-985a-2efd23a31dc4\") " Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.231004 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59452b25-eace-4fe3-985a-2efd23a31dc4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59452b25-eace-4fe3-985a-2efd23a31dc4" (UID: "59452b25-eace-4fe3-985a-2efd23a31dc4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.236025 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59452b25-eace-4fe3-985a-2efd23a31dc4-kube-api-access-zsr5t" (OuterVolumeSpecName: "kube-api-access-zsr5t") pod "59452b25-eace-4fe3-985a-2efd23a31dc4" (UID: "59452b25-eace-4fe3-985a-2efd23a31dc4"). InnerVolumeSpecName "kube-api-access-zsr5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.332107 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsr5t\" (UniqueName: \"kubernetes.io/projected/59452b25-eace-4fe3-985a-2efd23a31dc4-kube-api-access-zsr5t\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.332145 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59452b25-eace-4fe3-985a-2efd23a31dc4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.457483 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa316b86-b9d2-4285-9ede-7319f99b2b13" path="/var/lib/kubelet/pods/fa316b86-b9d2-4285-9ede-7319f99b2b13/volumes" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.780540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a59b-account-create-update-xr8hp" event={"ID":"59452b25-eace-4fe3-985a-2efd23a31dc4","Type":"ContainerDied","Data":"947b5eaf35cc8f42b4b668d96ef25ea4b36102063cb38e5bdc0415438863fb71"} Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.780922 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="947b5eaf35cc8f42b4b668d96ef25ea4b36102063cb38e5bdc0415438863fb71" Jan 30 06:46:44 crc kubenswrapper[4841]: I0130 06:46:44.780583 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-a59b-account-create-update-xr8hp" Jan 30 06:46:45 crc kubenswrapper[4841]: I0130 06:46:45.800942 4841 generic.go:334] "Generic (PLEG): container finished" podID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerID="6f16d134dfda198df4faa973050acd988c6ff6084ed57a09975e5e96d8f99c81" exitCode=0 Jan 30 06:46:45 crc kubenswrapper[4841]: I0130 06:46:45.801361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbx4d" event={"ID":"79fa1885-942f-48ce-94f4-e1c807e0ae4e","Type":"ContainerDied","Data":"6f16d134dfda198df4faa973050acd988c6ff6084ed57a09975e5e96d8f99c81"} Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.267000 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-7vfn6"] Jan 30 06:46:46 crc kubenswrapper[4841]: E0130 06:46:46.267456 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59452b25-eace-4fe3-985a-2efd23a31dc4" containerName="mariadb-account-create-update" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.267476 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="59452b25-eace-4fe3-985a-2efd23a31dc4" containerName="mariadb-account-create-update" Jan 30 06:46:46 crc kubenswrapper[4841]: E0130 06:46:46.267502 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7b918d-ef6a-4644-bf1a-d653c4cfdeac" containerName="mariadb-database-create" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.267511 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7b918d-ef6a-4644-bf1a-d653c4cfdeac" containerName="mariadb-database-create" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.267720 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="59452b25-eace-4fe3-985a-2efd23a31dc4" containerName="mariadb-account-create-update" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.267755 4841 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2c7b918d-ef6a-4644-bf1a-d653c4cfdeac" containerName="mariadb-database-create" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.268650 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.281393 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b57pq\" (UniqueName: \"kubernetes.io/projected/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-kube-api-access-b57pq\") pod \"octavia-persistence-db-create-7vfn6\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.281557 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-operator-scripts\") pod \"octavia-persistence-db-create-7vfn6\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.289788 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-7vfn6"] Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.384315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b57pq\" (UniqueName: \"kubernetes.io/projected/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-kube-api-access-b57pq\") pod \"octavia-persistence-db-create-7vfn6\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.384420 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-operator-scripts\") pod 
\"octavia-persistence-db-create-7vfn6\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.385241 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-operator-scripts\") pod \"octavia-persistence-db-create-7vfn6\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.416909 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b57pq\" (UniqueName: \"kubernetes.io/projected/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-kube-api-access-b57pq\") pod \"octavia-persistence-db-create-7vfn6\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:46 crc kubenswrapper[4841]: I0130 06:46:46.602940 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.259661 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-7vfn6"] Jan 30 06:46:47 crc kubenswrapper[4841]: E0130 06:46:47.306750 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59452b25_eace_4fe3_985a_2efd23a31dc4.slice\": RecentStats: unable to find data in memory cache]" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.658177 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-2df5-account-create-update-gqjjl"] Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.659722 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.662761 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.666092 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2df5-account-create-update-gqjjl"] Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.713530 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-operator-scripts\") pod \"octavia-2df5-account-create-update-gqjjl\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.713903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhcth\" (UniqueName: \"kubernetes.io/projected/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-kube-api-access-bhcth\") pod \"octavia-2df5-account-create-update-gqjjl\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.815640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhcth\" (UniqueName: \"kubernetes.io/projected/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-kube-api-access-bhcth\") pod \"octavia-2df5-account-create-update-gqjjl\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.815687 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-operator-scripts\") pod \"octavia-2df5-account-create-update-gqjjl\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.816567 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-operator-scripts\") pod \"octavia-2df5-account-create-update-gqjjl\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.842615 4841 generic.go:334] "Generic (PLEG): container finished" podID="ad3b0080-e6fb-4ac6-8e06-b4096892d2b8" containerID="c2912e987a0200d8fbbeb69885f2ad0a6171dd539ce5726ca73ec45187c4367c" exitCode=0 Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.842789 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-7vfn6" event={"ID":"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8","Type":"ContainerDied","Data":"c2912e987a0200d8fbbeb69885f2ad0a6171dd539ce5726ca73ec45187c4367c"} Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.842853 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-7vfn6" event={"ID":"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8","Type":"ContainerStarted","Data":"0117ebbaca161a43664f70a7dec6ecaf74c01b59d5e3750131ef0ae9325e52ed"} Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.843082 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhcth\" (UniqueName: \"kubernetes.io/projected/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-kube-api-access-bhcth\") pod \"octavia-2df5-account-create-update-gqjjl\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:47 crc 
kubenswrapper[4841]: I0130 06:46:47.848280 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbx4d" event={"ID":"79fa1885-942f-48ce-94f4-e1c807e0ae4e","Type":"ContainerStarted","Data":"46fe073b4e3f27bc04882787542a08ab3de1d6ba7289789393e949eebe098118"} Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.895819 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wbx4d" podStartSLOduration=4.34665301 podStartE2EDuration="7.895800527s" podCreationTimestamp="2026-01-30 06:46:40 +0000 UTC" firstStartedPulling="2026-01-30 06:46:42.749650313 +0000 UTC m=+5939.743122951" lastFinishedPulling="2026-01-30 06:46:46.29879779 +0000 UTC m=+5943.292270468" observedRunningTime="2026-01-30 06:46:47.889334895 +0000 UTC m=+5944.882807533" watchObservedRunningTime="2026-01-30 06:46:47.895800527 +0000 UTC m=+5944.889273175" Jan 30 06:46:47 crc kubenswrapper[4841]: I0130 06:46:47.989272 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:48 crc kubenswrapper[4841]: I0130 06:46:48.497704 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2df5-account-create-update-gqjjl"] Jan 30 06:46:48 crc kubenswrapper[4841]: I0130 06:46:48.857595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2df5-account-create-update-gqjjl" event={"ID":"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266","Type":"ContainerStarted","Data":"9ee49d0d2d9dc00ea3264731c984def580b89af0651675dfb616fe9f5ff92aff"} Jan 30 06:46:48 crc kubenswrapper[4841]: I0130 06:46:48.857670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2df5-account-create-update-gqjjl" event={"ID":"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266","Type":"ContainerStarted","Data":"c0c7ccccf09583b4b46688f93669cd9ad407cf10b80a99b56e75e231bb0661fa"} Jan 30 06:46:48 crc kubenswrapper[4841]: I0130 06:46:48.883743 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-2df5-account-create-update-gqjjl" podStartSLOduration=1.883725878 podStartE2EDuration="1.883725878s" podCreationTimestamp="2026-01-30 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:46:48.868677937 +0000 UTC m=+5945.862150585" watchObservedRunningTime="2026-01-30 06:46:48.883725878 +0000 UTC m=+5945.877198516" Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.260249 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.347143 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-operator-scripts\") pod \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.347224 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b57pq\" (UniqueName: \"kubernetes.io/projected/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-kube-api-access-b57pq\") pod \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\" (UID: \"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8\") " Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.347836 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad3b0080-e6fb-4ac6-8e06-b4096892d2b8" (UID: "ad3b0080-e6fb-4ac6-8e06-b4096892d2b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.353627 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-kube-api-access-b57pq" (OuterVolumeSpecName: "kube-api-access-b57pq") pod "ad3b0080-e6fb-4ac6-8e06-b4096892d2b8" (UID: "ad3b0080-e6fb-4ac6-8e06-b4096892d2b8"). InnerVolumeSpecName "kube-api-access-b57pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.450279 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.450315 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b57pq\" (UniqueName: \"kubernetes.io/projected/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8-kube-api-access-b57pq\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.885694 4841 generic.go:334] "Generic (PLEG): container finished" podID="9e24b2d2-fd41-4fb5-9433-e2e0d7b70266" containerID="9ee49d0d2d9dc00ea3264731c984def580b89af0651675dfb616fe9f5ff92aff" exitCode=0 Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.885835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2df5-account-create-update-gqjjl" event={"ID":"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266","Type":"ContainerDied","Data":"9ee49d0d2d9dc00ea3264731c984def580b89af0651675dfb616fe9f5ff92aff"} Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.891066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-7vfn6" event={"ID":"ad3b0080-e6fb-4ac6-8e06-b4096892d2b8","Type":"ContainerDied","Data":"0117ebbaca161a43664f70a7dec6ecaf74c01b59d5e3750131ef0ae9325e52ed"} Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.891121 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0117ebbaca161a43664f70a7dec6ecaf74c01b59d5e3750131ef0ae9325e52ed" Jan 30 06:46:49 crc kubenswrapper[4841]: I0130 06:46:49.891161 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-7vfn6" Jan 30 06:46:50 crc kubenswrapper[4841]: I0130 06:46:50.778662 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:50 crc kubenswrapper[4841]: I0130 06:46:50.779234 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.339152 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.395176 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-operator-scripts\") pod \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.395299 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhcth\" (UniqueName: \"kubernetes.io/projected/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-kube-api-access-bhcth\") pod \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\" (UID: \"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266\") " Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.395936 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e24b2d2-fd41-4fb5-9433-e2e0d7b70266" (UID: "9e24b2d2-fd41-4fb5-9433-e2e0d7b70266"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.403333 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-kube-api-access-bhcth" (OuterVolumeSpecName: "kube-api-access-bhcth") pod "9e24b2d2-fd41-4fb5-9433-e2e0d7b70266" (UID: "9e24b2d2-fd41-4fb5-9433-e2e0d7b70266"). InnerVolumeSpecName "kube-api-access-bhcth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.432970 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:46:51 crc kubenswrapper[4841]: E0130 06:46:51.433279 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.497538 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhcth\" (UniqueName: \"kubernetes.io/projected/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-kube-api-access-bhcth\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.497571 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.864033 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wbx4d" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="registry-server" 
probeResult="failure" output=< Jan 30 06:46:51 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Jan 30 06:46:51 crc kubenswrapper[4841]: > Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.922437 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2df5-account-create-update-gqjjl" event={"ID":"9e24b2d2-fd41-4fb5-9433-e2e0d7b70266","Type":"ContainerDied","Data":"c0c7ccccf09583b4b46688f93669cd9ad407cf10b80a99b56e75e231bb0661fa"} Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.922490 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0c7ccccf09583b4b46688f93669cd9ad407cf10b80a99b56e75e231bb0661fa" Jan 30 06:46:51 crc kubenswrapper[4841]: I0130 06:46:51.922569 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-2df5-account-create-update-gqjjl" Jan 30 06:46:52 crc kubenswrapper[4841]: I0130 06:46:52.405548 4841 scope.go:117] "RemoveContainer" containerID="953edd53b511753824e06f1ab59887cc192942476e8602e5fb50cb89d2e5da02" Jan 30 06:46:52 crc kubenswrapper[4841]: I0130 06:46:52.483110 4841 scope.go:117] "RemoveContainer" containerID="866cd25e2c9d737cc7210a35349328ee12b39451f6c10b9048c7377c362c082a" Jan 30 06:46:52 crc kubenswrapper[4841]: I0130 06:46:52.512085 4841 scope.go:117] "RemoveContainer" containerID="2919ee10621b59e484c5a46b7fc4eff039c6cadb1fa1165aac92b9a68ce08d69" Jan 30 06:46:52 crc kubenswrapper[4841]: I0130 06:46:52.567239 4841 scope.go:117] "RemoveContainer" containerID="8fa138e315a57640c62777a9d19e772b8f8b5bf1d5053e969ad622a94d7228b8" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.774720 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-7698788cdc-k27zp"] Jan 30 06:46:53 crc kubenswrapper[4841]: E0130 06:46:53.775150 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e24b2d2-fd41-4fb5-9433-e2e0d7b70266" 
containerName="mariadb-account-create-update" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.775165 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e24b2d2-fd41-4fb5-9433-e2e0d7b70266" containerName="mariadb-account-create-update" Jan 30 06:46:53 crc kubenswrapper[4841]: E0130 06:46:53.775199 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3b0080-e6fb-4ac6-8e06-b4096892d2b8" containerName="mariadb-database-create" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.775208 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3b0080-e6fb-4ac6-8e06-b4096892d2b8" containerName="mariadb-database-create" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.775439 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e24b2d2-fd41-4fb5-9433-e2e0d7b70266" containerName="mariadb-account-create-update" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.775454 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3b0080-e6fb-4ac6-8e06-b4096892d2b8" containerName="mariadb-database-create" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.777553 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.783129 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-c9fvw" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.783454 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.783907 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.784125 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.807275 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7698788cdc-k27zp"] Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.852147 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-octavia-run\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.852190 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-scripts\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.852303 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-config-data\") pod 
\"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.852338 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-combined-ca-bundle\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.852421 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-config-data-merged\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.852486 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-ovndb-tls-certs\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.955805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-config-data\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.956114 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-combined-ca-bundle\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.956209 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-config-data-merged\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.956327 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-ovndb-tls-certs\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.956443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-octavia-run\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.956527 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-scripts\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.956735 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-config-data-merged\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.956984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-octavia-run\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.961140 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-combined-ca-bundle\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.961441 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-config-data\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.961459 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-scripts\") pod \"octavia-api-7698788cdc-k27zp\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:53 crc kubenswrapper[4841]: I0130 06:46:53.964270 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-ovndb-tls-certs\") pod \"octavia-api-7698788cdc-k27zp\" 
(UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") " pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:54 crc kubenswrapper[4841]: I0130 06:46:54.127288 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:46:54 crc kubenswrapper[4841]: I0130 06:46:54.619369 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7698788cdc-k27zp"] Jan 30 06:46:54 crc kubenswrapper[4841]: W0130 06:46:54.619977 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53f391b1_d694_41d6_b689_b41496fe31ca.slice/crio-f8ee41bcd3852e15c6d24fddc6a1e91f6fbacb39c9940748d49d51c2bed37efb WatchSource:0}: Error finding container f8ee41bcd3852e15c6d24fddc6a1e91f6fbacb39c9940748d49d51c2bed37efb: Status 404 returned error can't find the container with id f8ee41bcd3852e15c6d24fddc6a1e91f6fbacb39c9940748d49d51c2bed37efb Jan 30 06:46:54 crc kubenswrapper[4841]: I0130 06:46:54.949562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7698788cdc-k27zp" event={"ID":"53f391b1-d694-41d6-b689-b41496fe31ca","Type":"ContainerStarted","Data":"f8ee41bcd3852e15c6d24fddc6a1e91f6fbacb39c9940748d49d51c2bed37efb"} Jan 30 06:46:57 crc kubenswrapper[4841]: E0130 06:46:57.553056 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59452b25_eace_4fe3_985a_2efd23a31dc4.slice\": RecentStats: unable to find data in memory cache]" Jan 30 06:47:00 crc kubenswrapper[4841]: I0130 06:47:00.828470 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:47:00 crc kubenswrapper[4841]: I0130 06:47:00.884357 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:47:01 crc kubenswrapper[4841]: I0130 06:47:01.084079 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbx4d"] Jan 30 06:47:02 crc kubenswrapper[4841]: I0130 06:47:02.024683 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wbx4d" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="registry-server" containerID="cri-o://46fe073b4e3f27bc04882787542a08ab3de1d6ba7289789393e949eebe098118" gracePeriod=2 Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.041881 4841 generic.go:334] "Generic (PLEG): container finished" podID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerID="46fe073b4e3f27bc04882787542a08ab3de1d6ba7289789393e949eebe098118" exitCode=0 Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.041942 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbx4d" event={"ID":"79fa1885-942f-48ce-94f4-e1c807e0ae4e","Type":"ContainerDied","Data":"46fe073b4e3f27bc04882787542a08ab3de1d6ba7289789393e949eebe098118"} Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.432195 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:47:03 crc kubenswrapper[4841]: E0130 06:47:03.432805 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.685688 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.853234 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlrms\" (UniqueName: \"kubernetes.io/projected/79fa1885-942f-48ce-94f4-e1c807e0ae4e-kube-api-access-dlrms\") pod \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.853702 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-catalog-content\") pod \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.853771 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-utilities\") pod \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\" (UID: \"79fa1885-942f-48ce-94f4-e1c807e0ae4e\") " Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.854804 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-utilities" (OuterVolumeSpecName: "utilities") pod "79fa1885-942f-48ce-94f4-e1c807e0ae4e" (UID: "79fa1885-942f-48ce-94f4-e1c807e0ae4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.862226 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fa1885-942f-48ce-94f4-e1c807e0ae4e-kube-api-access-dlrms" (OuterVolumeSpecName: "kube-api-access-dlrms") pod "79fa1885-942f-48ce-94f4-e1c807e0ae4e" (UID: "79fa1885-942f-48ce-94f4-e1c807e0ae4e"). InnerVolumeSpecName "kube-api-access-dlrms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.956120 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.956156 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlrms\" (UniqueName: \"kubernetes.io/projected/79fa1885-942f-48ce-94f4-e1c807e0ae4e-kube-api-access-dlrms\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:03 crc kubenswrapper[4841]: I0130 06:47:03.987701 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79fa1885-942f-48ce-94f4-e1c807e0ae4e" (UID: "79fa1885-942f-48ce-94f4-e1c807e0ae4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.055472 4841 generic.go:334] "Generic (PLEG): container finished" podID="53f391b1-d694-41d6-b689-b41496fe31ca" containerID="bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723" exitCode=0 Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.055555 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7698788cdc-k27zp" event={"ID":"53f391b1-d694-41d6-b689-b41496fe31ca","Type":"ContainerDied","Data":"bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723"} Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.057061 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa1885-942f-48ce-94f4-e1c807e0ae4e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.059279 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wbx4d" event={"ID":"79fa1885-942f-48ce-94f4-e1c807e0ae4e","Type":"ContainerDied","Data":"63bc894312fe6898f802d65001222053e36bb36267fdb714d68ad5da7fa6a24d"} Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.059356 4841 scope.go:117] "RemoveContainer" containerID="46fe073b4e3f27bc04882787542a08ab3de1d6ba7289789393e949eebe098118" Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.059380 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbx4d" Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.097523 4841 scope.go:117] "RemoveContainer" containerID="6f16d134dfda198df4faa973050acd988c6ff6084ed57a09975e5e96d8f99c81" Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.130134 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbx4d"] Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.142420 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wbx4d"] Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.156378 4841 scope.go:117] "RemoveContainer" containerID="86d54b1f315e6e14acf06def77c133f99cf1d55aa71981ceadf21aebf38ec0f5" Jan 30 06:47:04 crc kubenswrapper[4841]: I0130 06:47:04.443784 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" path="/var/lib/kubelet/pods/79fa1885-942f-48ce-94f4-e1c807e0ae4e/volumes" Jan 30 06:47:05 crc kubenswrapper[4841]: I0130 06:47:05.078955 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7698788cdc-k27zp" event={"ID":"53f391b1-d694-41d6-b689-b41496fe31ca","Type":"ContainerStarted","Data":"bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310"} Jan 30 06:47:05 crc kubenswrapper[4841]: I0130 06:47:05.079007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-api-7698788cdc-k27zp" event={"ID":"53f391b1-d694-41d6-b689-b41496fe31ca","Type":"ContainerStarted","Data":"5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5"} Jan 30 06:47:05 crc kubenswrapper[4841]: I0130 06:47:05.079136 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:47:05 crc kubenswrapper[4841]: I0130 06:47:05.079171 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:47:05 crc kubenswrapper[4841]: I0130 06:47:05.103755 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-7698788cdc-k27zp" podStartSLOduration=3.334864213 podStartE2EDuration="12.103736446s" podCreationTimestamp="2026-01-30 06:46:53 +0000 UTC" firstStartedPulling="2026-01-30 06:46:54.622101819 +0000 UTC m=+5951.615574457" lastFinishedPulling="2026-01-30 06:47:03.390974052 +0000 UTC m=+5960.384446690" observedRunningTime="2026-01-30 06:47:05.097903901 +0000 UTC m=+5962.091376549" watchObservedRunningTime="2026-01-30 06:47:05.103736446 +0000 UTC m=+5962.097209074" Jan 30 06:47:07 crc kubenswrapper[4841]: E0130 06:47:07.777488 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59452b25_eace_4fe3_985a_2efd23a31dc4.slice\": RecentStats: unable to find data in memory cache]" Jan 30 06:47:12 crc kubenswrapper[4841]: I0130 06:47:12.980610 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.065497 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7698788cdc-k27zp" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.258388 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-4wrmv" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.274638 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.293797 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2596f" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.426535 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4wrmv-config-7hlw5"] Jan 30 06:47:13 crc kubenswrapper[4841]: E0130 06:47:13.426891 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="registry-server" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.426902 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="registry-server" Jan 30 06:47:13 crc kubenswrapper[4841]: E0130 06:47:13.426923 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="extract-utilities" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.426929 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="extract-utilities" Jan 30 06:47:13 crc kubenswrapper[4841]: E0130 06:47:13.426941 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="extract-content" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.426948 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="extract-content" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.443488 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fa1885-942f-48ce-94f4-e1c807e0ae4e" containerName="registry-server" Jan 30 
06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.453586 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wrmv-config-7hlw5"] Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.454667 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wrmv-config-7hlw5" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.456901 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.562271 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run-ovn\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.562377 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmj7\" (UniqueName: \"kubernetes.io/projected/29c3ed04-7fdf-48b0-a734-a203da69e52c-kube-api-access-jdmj7\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.562417 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-additional-scripts\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5" Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.562475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.562491 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-scripts\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.562586 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-log-ovn\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.664588 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-log-ovn\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.664868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-log-ovn\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.665007 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run-ovn\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.665066 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run-ovn\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.665132 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmj7\" (UniqueName: \"kubernetes.io/projected/29c3ed04-7fdf-48b0-a734-a203da69e52c-kube-api-access-jdmj7\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.665431 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-additional-scripts\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.666041 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-additional-scripts\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.666107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.666169 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.666125 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-scripts\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.668460 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-scripts\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.686119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmj7\" (UniqueName: \"kubernetes.io/projected/29c3ed04-7fdf-48b0-a734-a203da69e52c-kube-api-access-jdmj7\") pod \"ovn-controller-4wrmv-config-7hlw5\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") " pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:13 crc kubenswrapper[4841]: I0130 06:47:13.774318 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:14 crc kubenswrapper[4841]: I0130 06:47:14.276373 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wrmv-config-7hlw5"]
Jan 30 06:47:15 crc kubenswrapper[4841]: I0130 06:47:15.203794 4841 generic.go:334] "Generic (PLEG): container finished" podID="29c3ed04-7fdf-48b0-a734-a203da69e52c" containerID="0ad4ef0bcb1ebca85479ccfcd113fdac37df6c082ca09a839d66babd6d174eca" exitCode=0
Jan 30 06:47:15 crc kubenswrapper[4841]: I0130 06:47:15.203848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wrmv-config-7hlw5" event={"ID":"29c3ed04-7fdf-48b0-a734-a203da69e52c","Type":"ContainerDied","Data":"0ad4ef0bcb1ebca85479ccfcd113fdac37df6c082ca09a839d66babd6d174eca"}
Jan 30 06:47:15 crc kubenswrapper[4841]: I0130 06:47:15.204134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wrmv-config-7hlw5" event={"ID":"29c3ed04-7fdf-48b0-a734-a203da69e52c","Type":"ContainerStarted","Data":"aa5fe77d1f328067db4bad15ec06c02a2d088d8cddc4adeb207099c10b3d117a"}
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.683937 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.835991 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run\") pod \"29c3ed04-7fdf-48b0-a734-a203da69e52c\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") "
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836120 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-log-ovn\") pod \"29c3ed04-7fdf-48b0-a734-a203da69e52c\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") "
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836140 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run" (OuterVolumeSpecName: "var-run") pod "29c3ed04-7fdf-48b0-a734-a203da69e52c" (UID: "29c3ed04-7fdf-48b0-a734-a203da69e52c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836169 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdmj7\" (UniqueName: \"kubernetes.io/projected/29c3ed04-7fdf-48b0-a734-a203da69e52c-kube-api-access-jdmj7\") pod \"29c3ed04-7fdf-48b0-a734-a203da69e52c\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") "
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836198 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "29c3ed04-7fdf-48b0-a734-a203da69e52c" (UID: "29c3ed04-7fdf-48b0-a734-a203da69e52c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836224 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-scripts\") pod \"29c3ed04-7fdf-48b0-a734-a203da69e52c\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") "
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836240 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run-ovn\") pod \"29c3ed04-7fdf-48b0-a734-a203da69e52c\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") "
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836307 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-additional-scripts\") pod \"29c3ed04-7fdf-48b0-a734-a203da69e52c\" (UID: \"29c3ed04-7fdf-48b0-a734-a203da69e52c\") "
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836445 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "29c3ed04-7fdf-48b0-a734-a203da69e52c" (UID: "29c3ed04-7fdf-48b0-a734-a203da69e52c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836714 4841 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836734 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-run\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.836744 4841 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ed04-7fdf-48b0-a734-a203da69e52c-var-log-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.837050 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "29c3ed04-7fdf-48b0-a734-a203da69e52c" (UID: "29c3ed04-7fdf-48b0-a734-a203da69e52c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.839004 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-scripts" (OuterVolumeSpecName: "scripts") pod "29c3ed04-7fdf-48b0-a734-a203da69e52c" (UID: "29c3ed04-7fdf-48b0-a734-a203da69e52c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.841380 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c3ed04-7fdf-48b0-a734-a203da69e52c-kube-api-access-jdmj7" (OuterVolumeSpecName: "kube-api-access-jdmj7") pod "29c3ed04-7fdf-48b0-a734-a203da69e52c" (UID: "29c3ed04-7fdf-48b0-a734-a203da69e52c"). InnerVolumeSpecName "kube-api-access-jdmj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.938636 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdmj7\" (UniqueName: \"kubernetes.io/projected/29c3ed04-7fdf-48b0-a734-a203da69e52c-kube-api-access-jdmj7\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.938894 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:16 crc kubenswrapper[4841]: I0130 06:47:16.938991 4841 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/29c3ed04-7fdf-48b0-a734-a203da69e52c-additional-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:17 crc kubenswrapper[4841]: I0130 06:47:17.224825 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wrmv-config-7hlw5" event={"ID":"29c3ed04-7fdf-48b0-a734-a203da69e52c","Type":"ContainerDied","Data":"aa5fe77d1f328067db4bad15ec06c02a2d088d8cddc4adeb207099c10b3d117a"}
Jan 30 06:47:17 crc kubenswrapper[4841]: I0130 06:47:17.224902 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa5fe77d1f328067db4bad15ec06c02a2d088d8cddc4adeb207099c10b3d117a"
Jan 30 06:47:17 crc kubenswrapper[4841]: I0130 06:47:17.225473 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wrmv-config-7hlw5"
Jan 30 06:47:17 crc kubenswrapper[4841]: I0130 06:47:17.433106 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"
Jan 30 06:47:17 crc kubenswrapper[4841]: E0130 06:47:17.433744 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef"
Jan 30 06:47:17 crc kubenswrapper[4841]: I0130 06:47:17.790627 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4wrmv-config-7hlw5"]
Jan 30 06:47:17 crc kubenswrapper[4841]: I0130 06:47:17.807606 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4wrmv-config-7hlw5"]
Jan 30 06:47:18 crc kubenswrapper[4841]: E0130 06:47:18.038155 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59452b25_eace_4fe3_985a_2efd23a31dc4.slice\": RecentStats: unable to find data in memory cache]"
Jan 30 06:47:18 crc kubenswrapper[4841]: I0130 06:47:18.447543 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c3ed04-7fdf-48b0-a734-a203da69e52c" path="/var/lib/kubelet/pods/29c3ed04-7fdf-48b0-a734-a203da69e52c/volumes"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.232913 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-c92xw"]
Jan 30 06:47:20 crc kubenswrapper[4841]: E0130 06:47:20.233351 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c3ed04-7fdf-48b0-a734-a203da69e52c" containerName="ovn-config"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.233367 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c3ed04-7fdf-48b0-a734-a203da69e52c" containerName="ovn-config"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.233632 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c3ed04-7fdf-48b0-a734-a203da69e52c" containerName="ovn-config"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.234798 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.237629 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.237780 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.252003 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.254642 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c92xw"]
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.320479 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f804e83-62a5-4880-b595-7ece506cb780-config-data\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.320530 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1f804e83-62a5-4880-b595-7ece506cb780-config-data-merged\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.320627 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1f804e83-62a5-4880-b595-7ece506cb780-hm-ports\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.320667 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f804e83-62a5-4880-b595-7ece506cb780-scripts\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.422543 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f804e83-62a5-4880-b595-7ece506cb780-config-data\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.422810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1f804e83-62a5-4880-b595-7ece506cb780-config-data-merged\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.422898 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1f804e83-62a5-4880-b595-7ece506cb780-hm-ports\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.422937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f804e83-62a5-4880-b595-7ece506cb780-scripts\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.423539 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1f804e83-62a5-4880-b595-7ece506cb780-config-data-merged\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.424023 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1f804e83-62a5-4880-b595-7ece506cb780-hm-ports\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.428315 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f804e83-62a5-4880-b595-7ece506cb780-scripts\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.429770 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f804e83-62a5-4880-b595-7ece506cb780-config-data\") pod \"octavia-rsyslog-c92xw\" (UID: \"1f804e83-62a5-4880-b595-7ece506cb780\") " pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:20 crc kubenswrapper[4841]: I0130 06:47:20.564311 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.016979 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-wb5wm"]
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.018960 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.021891 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.037290 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-wb5wm"]
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.137827 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-httpd-config\") pod \"octavia-image-upload-65dd99cb46-wb5wm\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") " pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.138149 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-amphora-image\") pod \"octavia-image-upload-65dd99cb46-wb5wm\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") " pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.177173 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c92xw"]
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.239551 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-httpd-config\") pod \"octavia-image-upload-65dd99cb46-wb5wm\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") " pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.239671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-amphora-image\") pod \"octavia-image-upload-65dd99cb46-wb5wm\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") " pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.240160 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-amphora-image\") pod \"octavia-image-upload-65dd99cb46-wb5wm\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") " pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.246736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-httpd-config\") pod \"octavia-image-upload-65dd99cb46-wb5wm\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") " pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.259981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c92xw" event={"ID":"1f804e83-62a5-4880-b595-7ece506cb780","Type":"ContainerStarted","Data":"28161d13f335e2d3c7ee41c8cc418d748da7d60efe2b4a88081565da690ead3e"}
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.347385 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c92xw"]
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.388587 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:47:21 crc kubenswrapper[4841]: I0130 06:47:21.877640 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-wb5wm"]
Jan 30 06:47:21 crc kubenswrapper[4841]: W0130 06:47:21.892008 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74ac8b5a_d763_4545_8b4a_bbaa045a56e3.slice/crio-1ad7df0036056a140694be934467b241bd6f87e35523f7a9809591b27ac08231 WatchSource:0}: Error finding container 1ad7df0036056a140694be934467b241bd6f87e35523f7a9809591b27ac08231: Status 404 returned error can't find the container with id 1ad7df0036056a140694be934467b241bd6f87e35523f7a9809591b27ac08231
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.070457 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-86b4d7c875-xgqnc"]
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.074099 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.080075 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.080480 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.086364 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-86b4d7c875-xgqnc"]
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-scripts\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174189 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-public-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174225 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/01c16743-1d26-4b7a-805a-1b7452a2dd0e-config-data-merged\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174252 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-ovndb-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174318 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-config-data\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-internal-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174389 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-combined-ca-bundle\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.174470 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/01c16743-1d26-4b7a-805a-1b7452a2dd0e-octavia-run\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.269747 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" event={"ID":"74ac8b5a-d763-4545-8b4a-bbaa045a56e3","Type":"ContainerStarted","Data":"1ad7df0036056a140694be934467b241bd6f87e35523f7a9809591b27ac08231"}
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.275915 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-ovndb-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276012 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-config-data\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276052 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-internal-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276095 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-combined-ca-bundle\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/01c16743-1d26-4b7a-805a-1b7452a2dd0e-octavia-run\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-scripts\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-public-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276194 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/01c16743-1d26-4b7a-805a-1b7452a2dd0e-config-data-merged\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.276622 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/01c16743-1d26-4b7a-805a-1b7452a2dd0e-config-data-merged\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.277188 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/01c16743-1d26-4b7a-805a-1b7452a2dd0e-octavia-run\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.282175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-config-data\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.285282 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-public-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.285462 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-scripts\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.293505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-internal-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.301159 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-ovndb-tls-certs\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.314868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c16743-1d26-4b7a-805a-1b7452a2dd0e-combined-ca-bundle\") pod \"octavia-api-86b4d7c875-xgqnc\" (UID: \"01c16743-1d26-4b7a-805a-1b7452a2dd0e\") " pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:22 crc kubenswrapper[4841]: I0130 06:47:22.424179 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:23 crc kubenswrapper[4841]: I0130 06:47:23.110125 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-86b4d7c875-xgqnc"]
Jan 30 06:47:23 crc kubenswrapper[4841]: W0130 06:47:23.217864 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01c16743_1d26_4b7a_805a_1b7452a2dd0e.slice/crio-4ea9bbf710da466e43f3dfb780ce64f8aa0aed45e61d836af4cb3bb03e21eb4a WatchSource:0}: Error finding container 4ea9bbf710da466e43f3dfb780ce64f8aa0aed45e61d836af4cb3bb03e21eb4a: Status 404 returned error can't find the container with id 4ea9bbf710da466e43f3dfb780ce64f8aa0aed45e61d836af4cb3bb03e21eb4a
Jan 30 06:47:23 crc kubenswrapper[4841]: I0130 06:47:23.294207 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c92xw" event={"ID":"1f804e83-62a5-4880-b595-7ece506cb780","Type":"ContainerStarted","Data":"7b7b0ba7de678e34805a0a740395061734207495002bb8091b9466e0c8805f1c"}
Jan 30 06:47:23 crc kubenswrapper[4841]: I0130 06:47:23.299238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-86b4d7c875-xgqnc" event={"ID":"01c16743-1d26-4b7a-805a-1b7452a2dd0e","Type":"ContainerStarted","Data":"4ea9bbf710da466e43f3dfb780ce64f8aa0aed45e61d836af4cb3bb03e21eb4a"}
Jan 30
06:47:24 crc kubenswrapper[4841]: I0130 06:47:24.310448 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-86b4d7c875-xgqnc" event={"ID":"01c16743-1d26-4b7a-805a-1b7452a2dd0e","Type":"ContainerStarted","Data":"3f9a0f93e2c5e824b6b3317c2f0871e659172e9389b75db6e6bcc85d4632d2fb"} Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.323392 4841 generic.go:334] "Generic (PLEG): container finished" podID="01c16743-1d26-4b7a-805a-1b7452a2dd0e" containerID="3f9a0f93e2c5e824b6b3317c2f0871e659172e9389b75db6e6bcc85d4632d2fb" exitCode=0 Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.324922 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-86b4d7c875-xgqnc" event={"ID":"01c16743-1d26-4b7a-805a-1b7452a2dd0e","Type":"ContainerDied","Data":"3f9a0f93e2c5e824b6b3317c2f0871e659172e9389b75db6e6bcc85d4632d2fb"} Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.500881 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-64w76"] Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.503079 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.504927 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.506766 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.507132 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.551387 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-64w76"] Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.654771 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-scripts\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.654836 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-amphora-certs\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.654870 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-config-data\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.655046 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-combined-ca-bundle\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.655083 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-config-data-merged\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.655100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-hm-ports\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.756498 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-config-data\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.756551 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-combined-ca-bundle\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.756575 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-config-data-merged\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.756593 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-hm-ports\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.756734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-scripts\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.756771 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-amphora-certs\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.757171 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-config-data-merged\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.758142 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-hm-ports\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.764095 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-amphora-certs\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.774449 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-config-data\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.776611 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-scripts\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.782012 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20a4f5d-18f6-4eeb-84ac-6d1ee628795e-combined-ca-bundle\") pod \"octavia-healthmanager-64w76\" (UID: \"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e\") " pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:25 crc kubenswrapper[4841]: I0130 06:47:25.861379 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.334036 4841 generic.go:334] "Generic (PLEG): container finished" podID="1f804e83-62a5-4880-b595-7ece506cb780" containerID="7b7b0ba7de678e34805a0a740395061734207495002bb8091b9466e0c8805f1c" exitCode=0 Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.334101 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c92xw" event={"ID":"1f804e83-62a5-4880-b595-7ece506cb780","Type":"ContainerDied","Data":"7b7b0ba7de678e34805a0a740395061734207495002bb8091b9466e0c8805f1c"} Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.344150 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-86b4d7c875-xgqnc" event={"ID":"01c16743-1d26-4b7a-805a-1b7452a2dd0e","Type":"ContainerStarted","Data":"9625e879d37e03fbfcd0a54724b364f6ae27b450d2dc0be96075aaf09197040a"} Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.344185 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-86b4d7c875-xgqnc" event={"ID":"01c16743-1d26-4b7a-805a-1b7452a2dd0e","Type":"ContainerStarted","Data":"8797bb5a231acd1c443eef4945da5e4ba0c2fc740a96a51479b3b2bc588a1a02"} Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.344845 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-86b4d7c875-xgqnc" Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.344887 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-86b4d7c875-xgqnc" Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.383996 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-86b4d7c875-xgqnc" podStartSLOduration=4.383978066 podStartE2EDuration="4.383978066s" podCreationTimestamp="2026-01-30 06:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:47:26.375152721 +0000 UTC m=+5983.368625359" watchObservedRunningTime="2026-01-30 06:47:26.383978066 +0000 UTC m=+5983.377450704" Jan 30 06:47:26 crc kubenswrapper[4841]: I0130 06:47:26.466774 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-64w76"] Jan 30 06:47:26 crc kubenswrapper[4841]: W0130 06:47:26.472063 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode20a4f5d_18f6_4eeb_84ac_6d1ee628795e.slice/crio-fa5e2961c72a32d259c0d70fe48d40b374d9d973838327c9cc7e275ca27c4b82 WatchSource:0}: Error finding container fa5e2961c72a32d259c0d70fe48d40b374d9d973838327c9cc7e275ca27c4b82: Status 404 returned error can't find the container with id fa5e2961c72a32d259c0d70fe48d40b374d9d973838327c9cc7e275ca27c4b82 Jan 30 06:47:27 crc kubenswrapper[4841]: I0130 06:47:27.353783 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-64w76" event={"ID":"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e","Type":"ContainerStarted","Data":"bf58abbbe911d79d1c2a227e5f50207e46467aa147b6ebf27c76fb004a8bcea0"} Jan 30 06:47:27 crc kubenswrapper[4841]: I0130 06:47:27.354107 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-64w76" event={"ID":"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e","Type":"ContainerStarted","Data":"fa5e2961c72a32d259c0d70fe48d40b374d9d973838327c9cc7e275ca27c4b82"} Jan 30 06:47:28 crc kubenswrapper[4841]: E0130 06:47:28.270359 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59452b25_eace_4fe3_985a_2efd23a31dc4.slice\": RecentStats: unable to find data in memory cache]" Jan 30 06:47:28 crc kubenswrapper[4841]: I0130 06:47:28.435456 4841 scope.go:117] "RemoveContainer" 
containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:47:28 crc kubenswrapper[4841]: E0130 06:47:28.436740 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.374138 4841 generic.go:334] "Generic (PLEG): container finished" podID="e20a4f5d-18f6-4eeb-84ac-6d1ee628795e" containerID="bf58abbbe911d79d1c2a227e5f50207e46467aa147b6ebf27c76fb004a8bcea0" exitCode=0 Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.374240 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-64w76" event={"ID":"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e","Type":"ContainerDied","Data":"bf58abbbe911d79d1c2a227e5f50207e46467aa147b6ebf27c76fb004a8bcea0"} Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.601780 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-fvvjj"] Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.604568 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.607701 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.634628 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-fvvjj"] Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.739892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-scripts\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.740062 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/621420dc-8438-4a7d-ad06-37e25becb572-config-data-merged\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.740091 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-config-data\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.740203 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-combined-ca-bundle\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 
06:47:29.842032 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/621420dc-8438-4a7d-ad06-37e25becb572-config-data-merged\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.842078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-config-data\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.842146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-combined-ca-bundle\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.842205 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-scripts\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.843115 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/621420dc-8438-4a7d-ad06-37e25becb572-config-data-merged\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.861461 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-config-data\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.863097 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-combined-ca-bundle\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.864133 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-scripts\") pod \"octavia-db-sync-fvvjj\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") " pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:29 crc kubenswrapper[4841]: I0130 06:47:29.935140 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-fvvjj" Jan 30 06:47:34 crc kubenswrapper[4841]: I0130 06:47:34.877524 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-fvvjj"] Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.456308 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" event={"ID":"74ac8b5a-d763-4545-8b4a-bbaa045a56e3","Type":"ContainerStarted","Data":"423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e"} Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.459588 4841 generic.go:334] "Generic (PLEG): container finished" podID="621420dc-8438-4a7d-ad06-37e25becb572" containerID="a58793f75867646f56782014bc7dfb75ad6b7818ff19eaa2ee886e3c411105dd" exitCode=0 Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.459639 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-fvvjj" event={"ID":"621420dc-8438-4a7d-ad06-37e25becb572","Type":"ContainerDied","Data":"a58793f75867646f56782014bc7dfb75ad6b7818ff19eaa2ee886e3c411105dd"} Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.459663 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-fvvjj" event={"ID":"621420dc-8438-4a7d-ad06-37e25becb572","Type":"ContainerStarted","Data":"3ab366b735a1985e23dac349eec5443a477a71a5edd9651773351736718fe02e"} Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.463761 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c92xw" event={"ID":"1f804e83-62a5-4880-b595-7ece506cb780","Type":"ContainerStarted","Data":"ccac50ea2e8d30301ca46b0dc7011760114f8488c1e6dc50344747b03d4e14bb"} Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.464310 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-c92xw" Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.466742 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/octavia-healthmanager-64w76" event={"ID":"e20a4f5d-18f6-4eeb-84ac-6d1ee628795e","Type":"ContainerStarted","Data":"6be725f149bc62cc2cbe10ce0d1e3f93773c96c80d1c0a938f6d18f26b0f754c"} Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.467225 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.508482 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-c92xw" podStartSLOduration=2.275380238 podStartE2EDuration="15.508461225s" podCreationTimestamp="2026-01-30 06:47:20 +0000 UTC" firstStartedPulling="2026-01-30 06:47:21.187739907 +0000 UTC m=+5978.181212545" lastFinishedPulling="2026-01-30 06:47:34.420820894 +0000 UTC m=+5991.414293532" observedRunningTime="2026-01-30 06:47:35.501351795 +0000 UTC m=+5992.494824483" watchObservedRunningTime="2026-01-30 06:47:35.508461225 +0000 UTC m=+5992.501933863" Jan 30 06:47:35 crc kubenswrapper[4841]: I0130 06:47:35.537617 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-64w76" podStartSLOduration=10.537598822 podStartE2EDuration="10.537598822s" podCreationTimestamp="2026-01-30 06:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:47:35.528836679 +0000 UTC m=+5992.522309337" watchObservedRunningTime="2026-01-30 06:47:35.537598822 +0000 UTC m=+5992.531071470" Jan 30 06:47:36 crc kubenswrapper[4841]: I0130 06:47:36.487336 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-fvvjj" event={"ID":"621420dc-8438-4a7d-ad06-37e25becb572","Type":"ContainerStarted","Data":"7c753b15d9f36d778db1c8d786c8c9dfc38edce3c553775a33eb5718aa479253"} Jan 30 06:47:36 crc kubenswrapper[4841]: I0130 06:47:36.538113 4841 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/octavia-db-sync-fvvjj" podStartSLOduration=7.538084639 podStartE2EDuration="7.538084639s" podCreationTimestamp="2026-01-30 06:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:47:36.517233542 +0000 UTC m=+5993.510706250" watchObservedRunningTime="2026-01-30 06:47:36.538084639 +0000 UTC m=+5993.531557317" Jan 30 06:47:38 crc kubenswrapper[4841]: E0130 06:47:38.475898 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59452b25_eace_4fe3_985a_2efd23a31dc4.slice\": RecentStats: unable to find data in memory cache]" Jan 30 06:47:38 crc kubenswrapper[4841]: I0130 06:47:38.561707 4841 generic.go:334] "Generic (PLEG): container finished" podID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerID="423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e" exitCode=0 Jan 30 06:47:38 crc kubenswrapper[4841]: I0130 06:47:38.561758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" event={"ID":"74ac8b5a-d763-4545-8b4a-bbaa045a56e3","Type":"ContainerDied","Data":"423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e"} Jan 30 06:47:39 crc kubenswrapper[4841]: I0130 06:47:39.576176 4841 generic.go:334] "Generic (PLEG): container finished" podID="621420dc-8438-4a7d-ad06-37e25becb572" containerID="7c753b15d9f36d778db1c8d786c8c9dfc38edce3c553775a33eb5718aa479253" exitCode=0 Jan 30 06:47:39 crc kubenswrapper[4841]: I0130 06:47:39.576717 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-fvvjj" event={"ID":"621420dc-8438-4a7d-ad06-37e25becb572","Type":"ContainerDied","Data":"7c753b15d9f36d778db1c8d786c8c9dfc38edce3c553775a33eb5718aa479253"} Jan 30 06:47:39 crc kubenswrapper[4841]: I0130 06:47:39.580774 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" event={"ID":"74ac8b5a-d763-4545-8b4a-bbaa045a56e3","Type":"ContainerStarted","Data":"d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9"} Jan 30 06:47:39 crc kubenswrapper[4841]: I0130 06:47:39.623675 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" podStartSLOduration=6.94540144 podStartE2EDuration="19.623649749s" podCreationTimestamp="2026-01-30 06:47:20 +0000 UTC" firstStartedPulling="2026-01-30 06:47:21.895259579 +0000 UTC m=+5978.888732217" lastFinishedPulling="2026-01-30 06:47:34.573507888 +0000 UTC m=+5991.566980526" observedRunningTime="2026-01-30 06:47:39.620991508 +0000 UTC m=+5996.614464166" watchObservedRunningTime="2026-01-30 06:47:39.623649749 +0000 UTC m=+5996.617122417" Jan 30 06:47:40 crc kubenswrapper[4841]: I0130 06:47:40.433044 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:47:40 crc kubenswrapper[4841]: E0130 06:47:40.433350 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:47:40 crc kubenswrapper[4841]: I0130 06:47:40.910142 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-64w76" Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.081825 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-fvvjj"
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.212609 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-config-data\") pod \"621420dc-8438-4a7d-ad06-37e25becb572\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") "
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.212687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/621420dc-8438-4a7d-ad06-37e25becb572-config-data-merged\") pod \"621420dc-8438-4a7d-ad06-37e25becb572\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") "
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.212779 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-scripts\") pod \"621420dc-8438-4a7d-ad06-37e25becb572\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") "
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.212894 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-combined-ca-bundle\") pod \"621420dc-8438-4a7d-ad06-37e25becb572\" (UID: \"621420dc-8438-4a7d-ad06-37e25becb572\") "
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.219028 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-scripts" (OuterVolumeSpecName: "scripts") pod "621420dc-8438-4a7d-ad06-37e25becb572" (UID: "621420dc-8438-4a7d-ad06-37e25becb572"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.224467 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-config-data" (OuterVolumeSpecName: "config-data") pod "621420dc-8438-4a7d-ad06-37e25becb572" (UID: "621420dc-8438-4a7d-ad06-37e25becb572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.251040 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621420dc-8438-4a7d-ad06-37e25becb572-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "621420dc-8438-4a7d-ad06-37e25becb572" (UID: "621420dc-8438-4a7d-ad06-37e25becb572"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.253872 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "621420dc-8438-4a7d-ad06-37e25becb572" (UID: "621420dc-8438-4a7d-ad06-37e25becb572"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.280297 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.315355 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.315692 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.315706 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/621420dc-8438-4a7d-ad06-37e25becb572-config-data-merged\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.315718 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621420dc-8438-4a7d-ad06-37e25becb572-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.376702 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-86b4d7c875-xgqnc"
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.486957 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-7698788cdc-k27zp"]
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.487240 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-7698788cdc-k27zp" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api" containerID="cri-o://5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5" gracePeriod=30
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.487718 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-7698788cdc-k27zp" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api-provider-agent" containerID="cri-o://bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310" gracePeriod=30
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.598104 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-fvvjj"
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.606391 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-fvvjj" event={"ID":"621420dc-8438-4a7d-ad06-37e25becb572","Type":"ContainerDied","Data":"3ab366b735a1985e23dac349eec5443a477a71a5edd9651773351736718fe02e"}
Jan 30 06:47:41 crc kubenswrapper[4841]: I0130 06:47:41.606577 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ab366b735a1985e23dac349eec5443a477a71a5edd9651773351736718fe02e"
Jan 30 06:47:43 crc kubenswrapper[4841]: I0130 06:47:43.622852 4841 generic.go:334] "Generic (PLEG): container finished" podID="53f391b1-d694-41d6-b689-b41496fe31ca" containerID="bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310" exitCode=0
Jan 30 06:47:43 crc kubenswrapper[4841]: I0130 06:47:43.622913 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7698788cdc-k27zp" event={"ID":"53f391b1-d694-41d6-b689-b41496fe31ca","Type":"ContainerDied","Data":"bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310"}
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.201107 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7698788cdc-k27zp"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.325175 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-ovndb-tls-certs\") pod \"53f391b1-d694-41d6-b689-b41496fe31ca\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") "
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.325232 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-combined-ca-bundle\") pod \"53f391b1-d694-41d6-b689-b41496fe31ca\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") "
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.325297 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-config-data\") pod \"53f391b1-d694-41d6-b689-b41496fe31ca\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") "
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.325509 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-octavia-run\") pod \"53f391b1-d694-41d6-b689-b41496fe31ca\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") "
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.325657 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-scripts\") pod \"53f391b1-d694-41d6-b689-b41496fe31ca\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") "
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.325854 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "53f391b1-d694-41d6-b689-b41496fe31ca" (UID: "53f391b1-d694-41d6-b689-b41496fe31ca"). InnerVolumeSpecName "octavia-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.326344 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-config-data-merged\") pod \"53f391b1-d694-41d6-b689-b41496fe31ca\" (UID: \"53f391b1-d694-41d6-b689-b41496fe31ca\") "
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.327181 4841 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-octavia-run\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.331445 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-scripts" (OuterVolumeSpecName: "scripts") pod "53f391b1-d694-41d6-b689-b41496fe31ca" (UID: "53f391b1-d694-41d6-b689-b41496fe31ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.332212 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-config-data" (OuterVolumeSpecName: "config-data") pod "53f391b1-d694-41d6-b689-b41496fe31ca" (UID: "53f391b1-d694-41d6-b689-b41496fe31ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.391540 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "53f391b1-d694-41d6-b689-b41496fe31ca" (UID: "53f391b1-d694-41d6-b689-b41496fe31ca"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.419090 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53f391b1-d694-41d6-b689-b41496fe31ca" (UID: "53f391b1-d694-41d6-b689-b41496fe31ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.429194 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.429229 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/53f391b1-d694-41d6-b689-b41496fe31ca-config-data-merged\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.429240 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.429248 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.502537 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "53f391b1-d694-41d6-b689-b41496fe31ca" (UID: "53f391b1-d694-41d6-b689-b41496fe31ca"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.530792 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f391b1-d694-41d6-b689-b41496fe31ca-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.648946 4841 generic.go:334] "Generic (PLEG): container finished" podID="53f391b1-d694-41d6-b689-b41496fe31ca" containerID="5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5" exitCode=0
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.648999 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7698788cdc-k27zp" event={"ID":"53f391b1-d694-41d6-b689-b41496fe31ca","Type":"ContainerDied","Data":"5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5"}
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.649025 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7698788cdc-k27zp" event={"ID":"53f391b1-d694-41d6-b689-b41496fe31ca","Type":"ContainerDied","Data":"f8ee41bcd3852e15c6d24fddc6a1e91f6fbacb39c9940748d49d51c2bed37efb"}
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.649044 4841 scope.go:117] "RemoveContainer" containerID="bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.649203 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7698788cdc-k27zp"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.686719 4841 scope.go:117] "RemoveContainer" containerID="5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.698482 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-7698788cdc-k27zp"]
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.706680 4841 scope.go:117] "RemoveContainer" containerID="bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.713900 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-7698788cdc-k27zp"]
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.733619 4841 scope.go:117] "RemoveContainer" containerID="bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310"
Jan 30 06:47:45 crc kubenswrapper[4841]: E0130 06:47:45.734392 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310\": container with ID starting with bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310 not found: ID does not exist" containerID="bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.734511 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310"} err="failed to get container status \"bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310\": rpc error: code = NotFound desc = could not find container \"bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310\": container with ID starting with bd498c428db5c8ce2a3c783cc3ca72e2850d611604c15a36748250898ba72310 not found: ID does not exist"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.734565 4841 scope.go:117] "RemoveContainer" containerID="5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5"
Jan 30 06:47:45 crc kubenswrapper[4841]: E0130 06:47:45.736105 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5\": container with ID starting with 5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5 not found: ID does not exist" containerID="5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.736183 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5"} err="failed to get container status \"5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5\": rpc error: code = NotFound desc = could not find container \"5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5\": container with ID starting with 5f5a4327fd293ac9a5a762ee728be043de1e9de348d70f1c963967c755bfbfe5 not found: ID does not exist"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.736234 4841 scope.go:117] "RemoveContainer" containerID="bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723"
Jan 30 06:47:45 crc kubenswrapper[4841]: E0130 06:47:45.740002 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723\": container with ID starting with bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723 not found: ID does not exist" containerID="bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723"
Jan 30 06:47:45 crc kubenswrapper[4841]: I0130 06:47:45.740026 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723"} err="failed to get container status \"bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723\": rpc error: code = NotFound desc = could not find container \"bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723\": container with ID starting with bf63610ed30e7b4d2dde3128803743eb1ac3f8b436f2568a2196d5d987954723 not found: ID does not exist"
Jan 30 06:47:46 crc kubenswrapper[4841]: I0130 06:47:46.451358 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" path="/var/lib/kubelet/pods/53f391b1-d694-41d6-b689-b41496fe31ca/volumes"
Jan 30 06:47:50 crc kubenswrapper[4841]: I0130 06:47:50.611155 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-c92xw"
Jan 30 06:47:55 crc kubenswrapper[4841]: I0130 06:47:55.433197 4841 scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156"
Jan 30 06:47:55 crc kubenswrapper[4841]: I0130 06:47:55.788500 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"5f5c280cd214672383e7f51c0a2877cc3f0cec94eda93bd7736e321f46e8506f"}
Jan 30 06:48:12 crc kubenswrapper[4841]: I0130 06:48:12.209453 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-wb5wm"]
Jan 30 06:48:12 crc kubenswrapper[4841]: I0130 06:48:12.210168 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" podUID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerName="octavia-amphora-httpd" containerID="cri-o://d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9" gracePeriod=30
Jan 30 06:48:12 crc kubenswrapper[4841]: I0130 06:48:12.848580 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:48:12 crc kubenswrapper[4841]: I0130 06:48:12.954840 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-amphora-image\") pod \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") "
Jan 30 06:48:12 crc kubenswrapper[4841]: I0130 06:48:12.955033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-httpd-config\") pod \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\" (UID: \"74ac8b5a-d763-4545-8b4a-bbaa045a56e3\") "
Jan 30 06:48:12 crc kubenswrapper[4841]: I0130 06:48:12.992217 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "74ac8b5a-d763-4545-8b4a-bbaa045a56e3" (UID: "74ac8b5a-d763-4545-8b4a-bbaa045a56e3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.023979 4841 generic.go:334] "Generic (PLEG): container finished" podID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerID="d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9" exitCode=0
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.024028 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" event={"ID":"74ac8b5a-d763-4545-8b4a-bbaa045a56e3","Type":"ContainerDied","Data":"d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9"}
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.024053 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm" event={"ID":"74ac8b5a-d763-4545-8b4a-bbaa045a56e3","Type":"ContainerDied","Data":"1ad7df0036056a140694be934467b241bd6f87e35523f7a9809591b27ac08231"}
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.024068 4841 scope.go:117] "RemoveContainer" containerID="d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9"
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.024101 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-wb5wm"
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.032575 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "74ac8b5a-d763-4545-8b4a-bbaa045a56e3" (UID: "74ac8b5a-d763-4545-8b4a-bbaa045a56e3"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.057096 4841 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-amphora-image\") on node \"crc\" DevicePath \"\""
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.057126 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ac8b5a-d763-4545-8b4a-bbaa045a56e3-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.094691 4841 scope.go:117] "RemoveContainer" containerID="423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e"
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.113233 4841 scope.go:117] "RemoveContainer" containerID="d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9"
Jan 30 06:48:13 crc kubenswrapper[4841]: E0130 06:48:13.113722 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9\": container with ID starting with d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9 not found: ID does not exist" containerID="d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9"
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.113781 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9"} err="failed to get container status \"d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9\": rpc error: code = NotFound desc = could not find container \"d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9\": container with ID starting with d206e4ffbcb611138c710022d3bcf76e59734fc2712e35ef9d3c8c25fe8fcea9 not found: ID does not exist"
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.113808 4841 scope.go:117] "RemoveContainer" containerID="423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e"
Jan 30 06:48:13 crc kubenswrapper[4841]: E0130 06:48:13.114193 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e\": container with ID starting with 423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e not found: ID does not exist" containerID="423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e"
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.114236 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e"} err="failed to get container status \"423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e\": rpc error: code = NotFound desc = could not find container \"423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e\": container with ID starting with 423a3cbc737e118190f4aba3e6fe1338963c36be5bedffc4ac72c6f922b4a55e not found: ID does not exist"
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.355518 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-wb5wm"]
Jan 30 06:48:13 crc kubenswrapper[4841]: I0130 06:48:13.363171 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-wb5wm"]
Jan 30 06:48:14 crc kubenswrapper[4841]: I0130 06:48:14.451235 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" path="/var/lib/kubelet/pods/74ac8b5a-d763-4545-8b4a-bbaa045a56e3/volumes"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.610476 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-xxkcb"]
Jan 30 06:48:20 crc kubenswrapper[4841]: E0130 06:48:20.612215 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612250 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api"
Jan 30 06:48:20 crc kubenswrapper[4841]: E0130 06:48:20.612285 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerName="init"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612304 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerName="init"
Jan 30 06:48:20 crc kubenswrapper[4841]: E0130 06:48:20.612345 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="init"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612362 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="init"
Jan 30 06:48:20 crc kubenswrapper[4841]: E0130 06:48:20.612445 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621420dc-8438-4a7d-ad06-37e25becb572" containerName="octavia-db-sync"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612466 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="621420dc-8438-4a7d-ad06-37e25becb572" containerName="octavia-db-sync"
Jan 30 06:48:20 crc kubenswrapper[4841]: E0130 06:48:20.612504 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621420dc-8438-4a7d-ad06-37e25becb572" containerName="init"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612521 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="621420dc-8438-4a7d-ad06-37e25becb572" containerName="init"
Jan 30 06:48:20 crc kubenswrapper[4841]: E0130 06:48:20.612550 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerName="octavia-amphora-httpd"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612569 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerName="octavia-amphora-httpd"
Jan 30 06:48:20 crc kubenswrapper[4841]: E0130 06:48:20.612599 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api-provider-agent"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612618 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api-provider-agent"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.612995 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="621420dc-8438-4a7d-ad06-37e25becb572" containerName="octavia-db-sync"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.613022 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ac8b5a-d763-4545-8b4a-bbaa045a56e3" containerName="octavia-amphora-httpd"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.613069 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.613095 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f391b1-d694-41d6-b689-b41496fe31ca" containerName="octavia-api-provider-agent"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.615292 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.622563 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.629970 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-xxkcb"]
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.651876 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb-httpd-config\") pod \"octavia-image-upload-65dd99cb46-xxkcb\" (UID: \"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb\") " pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.653520 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb-amphora-image\") pod \"octavia-image-upload-65dd99cb46-xxkcb\" (UID: \"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb\") " pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.756306 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb-amphora-image\") pod \"octavia-image-upload-65dd99cb46-xxkcb\" (UID: \"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb\") " pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.756576 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb-httpd-config\") pod \"octavia-image-upload-65dd99cb46-xxkcb\" (UID: \"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb\") " pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.757249 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb-amphora-image\") pod \"octavia-image-upload-65dd99cb46-xxkcb\" (UID: \"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb\") " pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.766225 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb-httpd-config\") pod \"octavia-image-upload-65dd99cb46-xxkcb\" (UID: \"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb\") " pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:20 crc kubenswrapper[4841]: I0130 06:48:20.942310 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-xxkcb"
Jan 30 06:48:21 crc kubenswrapper[4841]: I0130 06:48:21.470793 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-xxkcb"]
Jan 30 06:48:22 crc kubenswrapper[4841]: I0130 06:48:22.176130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-xxkcb" event={"ID":"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb","Type":"ContainerStarted","Data":"bb0cc2d2b90cab63f7f11d62fbfe37903688ba921c4f27b51834a0b917f6ee83"}
Jan 30 06:48:23 crc kubenswrapper[4841]: I0130 06:48:23.193799 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-xxkcb" event={"ID":"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb","Type":"ContainerStarted","Data":"852752c2e0f855bb0f94f7a08896d480c54b57fc70a8ade64642221a797bc80b"}
Jan 30 06:48:24 crc kubenswrapper[4841]: I0130 06:48:24.208543 4841 generic.go:334] "Generic (PLEG): container finished" podID="c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb" containerID="852752c2e0f855bb0f94f7a08896d480c54b57fc70a8ade64642221a797bc80b" exitCode=0
Jan 30 06:48:24 crc kubenswrapper[4841]: I0130 06:48:24.208606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-xxkcb" event={"ID":"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb","Type":"ContainerDied","Data":"852752c2e0f855bb0f94f7a08896d480c54b57fc70a8ade64642221a797bc80b"}
Jan 30 06:48:25 crc kubenswrapper[4841]: I0130 06:48:25.223444 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-xxkcb" event={"ID":"c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb","Type":"ContainerStarted","Data":"b162d1d4e959b3d1326e3900d800e98fecc3b94fe86a79512fef168c452890bf"}
Jan 30 06:48:25 crc kubenswrapper[4841]: I0130 06:48:25.260374 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-65dd99cb46-xxkcb" podStartSLOduration=4.757415965 podStartE2EDuration="5.26035211s" podCreationTimestamp="2026-01-30 06:48:20 +0000 UTC" firstStartedPulling="2026-01-30 06:48:21.486006016 +0000 UTC m=+6038.479478694" lastFinishedPulling="2026-01-30 06:48:21.988942191 +0000 UTC m=+6038.982414839" observedRunningTime="2026-01-30 06:48:25.245497544 +0000 UTC m=+6042.238970222" watchObservedRunningTime="2026-01-30 06:48:25.26035211 +0000 UTC m=+6042.253824758"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.366546 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-d9nvn"]
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.369767 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.373424 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.373935 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.375831 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-d9nvn"]
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.546665 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-config-data\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.546767 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8414debc-743c-4450-9a4b-d3cc68df42c7-config-data-merged\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.546807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-amphora-certs\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.547043 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-combined-ca-bundle\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.547318 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8414debc-743c-4450-9a4b-d3cc68df42c7-hm-ports\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.547455 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-scripts\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.649906 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8414debc-743c-4450-9a4b-d3cc68df42c7-hm-ports\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.649998 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-scripts\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn"
Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.650102 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-config-data\") pod
\"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.650178 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8414debc-743c-4450-9a4b-d3cc68df42c7-config-data-merged\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.650224 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-amphora-certs\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.651045 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8414debc-743c-4450-9a4b-d3cc68df42c7-config-data-merged\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.651590 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-combined-ca-bundle\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.651801 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8414debc-743c-4450-9a4b-d3cc68df42c7-hm-ports\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " 
pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.659365 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-config-data\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.670873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-amphora-certs\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.672487 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-scripts\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.676342 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8414debc-743c-4450-9a4b-d3cc68df42c7-combined-ca-bundle\") pod \"octavia-housekeeping-d9nvn\" (UID: \"8414debc-743c-4450-9a4b-d3cc68df42c7\") " pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:29 crc kubenswrapper[4841]: I0130 06:48:29.706537 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:30 crc kubenswrapper[4841]: I0130 06:48:30.296730 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-d9nvn"] Jan 30 06:48:30 crc kubenswrapper[4841]: W0130 06:48:30.306092 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8414debc_743c_4450_9a4b_d3cc68df42c7.slice/crio-5da371455ecab5ccff59e8c286d84cc0fc88ba23e736e753e4e4f89323f4713f WatchSource:0}: Error finding container 5da371455ecab5ccff59e8c286d84cc0fc88ba23e736e753e4e4f89323f4713f: Status 404 returned error can't find the container with id 5da371455ecab5ccff59e8c286d84cc0fc88ba23e736e753e4e4f89323f4713f Jan 30 06:48:30 crc kubenswrapper[4841]: I0130 06:48:30.308414 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.298636 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-d9nvn" event={"ID":"8414debc-743c-4450-9a4b-d3cc68df42c7","Type":"ContainerStarted","Data":"5da371455ecab5ccff59e8c286d84cc0fc88ba23e736e753e4e4f89323f4713f"} Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.917365 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-55s5s"] Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.919111 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-55s5s" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.921496 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.921953 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.929296 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-55s5s"] Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.961802 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-config-data\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.961874 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-amphora-certs\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.961906 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/794000ea-5a8e-44dd-b44c-af445a21f8ec-config-data-merged\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.961928 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/794000ea-5a8e-44dd-b44c-af445a21f8ec-hm-ports\") pod 
\"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.961968 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-combined-ca-bundle\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:31 crc kubenswrapper[4841]: I0130 06:48:31.962015 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-scripts\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.063611 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-combined-ca-bundle\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.063714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-scripts\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.063796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-config-data\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 
crc kubenswrapper[4841]: I0130 06:48:32.063858 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-amphora-certs\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.063911 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/794000ea-5a8e-44dd-b44c-af445a21f8ec-config-data-merged\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.063939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/794000ea-5a8e-44dd-b44c-af445a21f8ec-hm-ports\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.064795 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/794000ea-5a8e-44dd-b44c-af445a21f8ec-config-data-merged\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.065348 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/794000ea-5a8e-44dd-b44c-af445a21f8ec-hm-ports\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.071374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-amphora-certs\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.072244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-scripts\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.076658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-combined-ca-bundle\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.087823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794000ea-5a8e-44dd-b44c-af445a21f8ec-config-data\") pod \"octavia-worker-55s5s\" (UID: \"794000ea-5a8e-44dd-b44c-af445a21f8ec\") " pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.245218 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-55s5s" Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.326533 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-d9nvn" event={"ID":"8414debc-743c-4450-9a4b-d3cc68df42c7","Type":"ContainerStarted","Data":"f2355ef5314bd62f1aef323b0c072da559b2cbbaa970ace5c5b1d0a3351ff07d"} Jan 30 06:48:32 crc kubenswrapper[4841]: I0130 06:48:32.827498 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-55s5s"] Jan 30 06:48:32 crc kubenswrapper[4841]: W0130 06:48:32.876024 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod794000ea_5a8e_44dd_b44c_af445a21f8ec.slice/crio-4fd5a9a7f1419af24955d4aa1174577de6461bbbbb968a84b66610828bc2243b WatchSource:0}: Error finding container 4fd5a9a7f1419af24955d4aa1174577de6461bbbbb968a84b66610828bc2243b: Status 404 returned error can't find the container with id 4fd5a9a7f1419af24955d4aa1174577de6461bbbbb968a84b66610828bc2243b Jan 30 06:48:33 crc kubenswrapper[4841]: I0130 06:48:33.338010 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-55s5s" event={"ID":"794000ea-5a8e-44dd-b44c-af445a21f8ec","Type":"ContainerStarted","Data":"4fd5a9a7f1419af24955d4aa1174577de6461bbbbb968a84b66610828bc2243b"} Jan 30 06:48:33 crc kubenswrapper[4841]: I0130 06:48:33.528871 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-64w76"] Jan 30 06:48:34 crc kubenswrapper[4841]: I0130 06:48:34.353729 4841 generic.go:334] "Generic (PLEG): container finished" podID="8414debc-743c-4450-9a4b-d3cc68df42c7" containerID="f2355ef5314bd62f1aef323b0c072da559b2cbbaa970ace5c5b1d0a3351ff07d" exitCode=0 Jan 30 06:48:34 crc kubenswrapper[4841]: I0130 06:48:34.353791 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-d9nvn" 
event={"ID":"8414debc-743c-4450-9a4b-d3cc68df42c7","Type":"ContainerDied","Data":"f2355ef5314bd62f1aef323b0c072da559b2cbbaa970ace5c5b1d0a3351ff07d"} Jan 30 06:48:35 crc kubenswrapper[4841]: I0130 06:48:35.369097 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-55s5s" event={"ID":"794000ea-5a8e-44dd-b44c-af445a21f8ec","Type":"ContainerStarted","Data":"e3c8dc96f5458eaffc2456a058017d6cd28da139b52aac61934ec9fe2ec8836d"} Jan 30 06:48:35 crc kubenswrapper[4841]: I0130 06:48:35.376281 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-d9nvn" event={"ID":"8414debc-743c-4450-9a4b-d3cc68df42c7","Type":"ContainerStarted","Data":"13eebfb72447e2686ea905b99de4830c664598f8c1fcd3a5595d9c3bcc40a714"} Jan 30 06:48:35 crc kubenswrapper[4841]: I0130 06:48:35.376830 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:35 crc kubenswrapper[4841]: I0130 06:48:35.446968 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-d9nvn" podStartSLOduration=5.097572286 podStartE2EDuration="6.446942568s" podCreationTimestamp="2026-01-30 06:48:29 +0000 UTC" firstStartedPulling="2026-01-30 06:48:30.308164911 +0000 UTC m=+6047.301637549" lastFinishedPulling="2026-01-30 06:48:31.657535193 +0000 UTC m=+6048.651007831" observedRunningTime="2026-01-30 06:48:35.435737309 +0000 UTC m=+6052.429209957" watchObservedRunningTime="2026-01-30 06:48:35.446942568 +0000 UTC m=+6052.440415236" Jan 30 06:48:36 crc kubenswrapper[4841]: I0130 06:48:36.397736 4841 generic.go:334] "Generic (PLEG): container finished" podID="794000ea-5a8e-44dd-b44c-af445a21f8ec" containerID="e3c8dc96f5458eaffc2456a058017d6cd28da139b52aac61934ec9fe2ec8836d" exitCode=0 Jan 30 06:48:36 crc kubenswrapper[4841]: I0130 06:48:36.397971 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-55s5s" 
event={"ID":"794000ea-5a8e-44dd-b44c-af445a21f8ec","Type":"ContainerDied","Data":"e3c8dc96f5458eaffc2456a058017d6cd28da139b52aac61934ec9fe2ec8836d"} Jan 30 06:48:37 crc kubenswrapper[4841]: I0130 06:48:37.413371 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-55s5s" event={"ID":"794000ea-5a8e-44dd-b44c-af445a21f8ec","Type":"ContainerStarted","Data":"40834c2745ea6f5d75810d7b7d3a503c57b7f59cbbe8b2b2056dd1f6c84c4145"} Jan 30 06:48:37 crc kubenswrapper[4841]: I0130 06:48:37.413762 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-55s5s" Jan 30 06:48:37 crc kubenswrapper[4841]: I0130 06:48:37.436997 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-55s5s" podStartSLOduration=4.55277904 podStartE2EDuration="6.436979488s" podCreationTimestamp="2026-01-30 06:48:31 +0000 UTC" firstStartedPulling="2026-01-30 06:48:32.878917751 +0000 UTC m=+6049.872390399" lastFinishedPulling="2026-01-30 06:48:34.763118209 +0000 UTC m=+6051.756590847" observedRunningTime="2026-01-30 06:48:37.434006229 +0000 UTC m=+6054.427478867" watchObservedRunningTime="2026-01-30 06:48:37.436979488 +0000 UTC m=+6054.430452126" Jan 30 06:48:44 crc kubenswrapper[4841]: I0130 06:48:44.747491 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-d9nvn" Jan 30 06:48:47 crc kubenswrapper[4841]: I0130 06:48:47.295007 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-55s5s" Jan 30 06:49:01 crc kubenswrapper[4841]: I0130 06:49:01.054334 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f53a-account-create-update-t7kwr"] Jan 30 06:49:01 crc kubenswrapper[4841]: I0130 06:49:01.070823 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-nklng"] Jan 30 06:49:01 crc kubenswrapper[4841]: I0130 06:49:01.079698 
4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f53a-account-create-update-t7kwr"] Jan 30 06:49:01 crc kubenswrapper[4841]: I0130 06:49:01.087082 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-nklng"] Jan 30 06:49:02 crc kubenswrapper[4841]: I0130 06:49:02.457919 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fc0c89-4399-4a71-ad3b-40cc2361d1b3" path="/var/lib/kubelet/pods/23fc0c89-4399-4a71-ad3b-40cc2361d1b3/volumes" Jan 30 06:49:02 crc kubenswrapper[4841]: I0130 06:49:02.458949 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c897c9-7826-48bc-89cb-01a9622ea531" path="/var/lib/kubelet/pods/35c897c9-7826-48bc-89cb-01a9622ea531/volumes" Jan 30 06:49:21 crc kubenswrapper[4841]: I0130 06:49:21.067220 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-67gbp"] Jan 30 06:49:21 crc kubenswrapper[4841]: I0130 06:49:21.079367 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-67gbp"] Jan 30 06:49:22 crc kubenswrapper[4841]: I0130 06:49:22.467152 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcdfd69-9665-4513-a512-13b0d0914f33" path="/var/lib/kubelet/pods/cfcdfd69-9665-4513-a512-13b0d0914f33/volumes" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.454931 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cd6dfbc4f-mgq48"] Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.457495 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.467861 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cd6dfbc4f-mgq48"] Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.469718 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.469897 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.469855 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-57bfs" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.470592 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.493341 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.493700 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-log" containerID="cri-o://91bc07556715b9b9bfde8ae1343195414189761c5a84ed76fb8332062c9a4a0c" gracePeriod=30 Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.494101 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-httpd" containerID="cri-o://71df0b5d837f19d4320f3cdb5e1bd39c13694d215319db8e6ce04bcd80b0fbae" gracePeriod=30 Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.563838 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.564126 
4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-log" containerID="cri-o://f23d5789a7d0f92a2ffe4f082632a36d784cdaaf4d451234700e7ea77d4bc8d5" gracePeriod=30 Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.564628 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-httpd" containerID="cri-o://80f19d874887893ff5379339d65dec8893e864761923c4b813ab8523b9791d51" gracePeriod=30 Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.589985 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-config-data\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.590045 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32f3107e-b6dd-4265-8517-b38b5f4eae56-horizon-secret-key\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.590071 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f3107e-b6dd-4265-8517-b38b5f4eae56-logs\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.590116 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-scripts\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.590148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zm4w\" (UniqueName: \"kubernetes.io/projected/32f3107e-b6dd-4265-8517-b38b5f4eae56-kube-api-access-8zm4w\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.754701 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-config-data\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.754785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32f3107e-b6dd-4265-8517-b38b5f4eae56-horizon-secret-key\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.754820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f3107e-b6dd-4265-8517-b38b5f4eae56-logs\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.754887 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-scripts\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.754928 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zm4w\" (UniqueName: \"kubernetes.io/projected/32f3107e-b6dd-4265-8517-b38b5f4eae56-kube-api-access-8zm4w\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.755773 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f3107e-b6dd-4265-8517-b38b5f4eae56-logs\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.756422 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-scripts\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.756395 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-config-data\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.762536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32f3107e-b6dd-4265-8517-b38b5f4eae56-horizon-secret-key\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: 
\"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.778137 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zm4w\" (UniqueName: \"kubernetes.io/projected/32f3107e-b6dd-4265-8517-b38b5f4eae56-kube-api-access-8zm4w\") pod \"horizon-5cd6dfbc4f-mgq48\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") " pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.789168 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.838918 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-665cc8dc4f-n4nmm"] Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.842195 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.864754 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665cc8dc4f-n4nmm"] Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.959185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-logs\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.959241 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-config-data\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.959269 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-horizon-secret-key\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.959391 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqdj\" (UniqueName: \"kubernetes.io/projected/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-kube-api-access-bfqdj\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:42 crc kubenswrapper[4841]: I0130 06:49:42.959508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-scripts\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.062814 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-config-data\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.062876 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-horizon-secret-key\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.062945 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bfqdj\" (UniqueName: \"kubernetes.io/projected/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-kube-api-access-bfqdj\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.062973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-scripts\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.063867 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-scripts\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.063959 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-logs\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.064193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-logs\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.064573 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-config-data\") pod 
\"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.071158 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-horizon-secret-key\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.089381 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqdj\" (UniqueName: \"kubernetes.io/projected/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-kube-api-access-bfqdj\") pod \"horizon-665cc8dc4f-n4nmm\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") " pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.220384 4841 generic.go:334] "Generic (PLEG): container finished" podID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerID="91bc07556715b9b9bfde8ae1343195414189761c5a84ed76fb8332062c9a4a0c" exitCode=143 Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.220662 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95f2605a-425e-4320-a8c0-e78d3ad93fbb","Type":"ContainerDied","Data":"91bc07556715b9b9bfde8ae1343195414189761c5a84ed76fb8332062c9a4a0c"} Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.224645 4841 generic.go:334] "Generic (PLEG): container finished" podID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerID="f23d5789a7d0f92a2ffe4f082632a36d784cdaaf4d451234700e7ea77d4bc8d5" exitCode=143 Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.224742 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b63ca59c-2c67-4c8d-8cef-344f3e76ba06","Type":"ContainerDied","Data":"f23d5789a7d0f92a2ffe4f082632a36d784cdaaf4d451234700e7ea77d4bc8d5"} Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.226271 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.284261 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cd6dfbc4f-mgq48"] Jan 30 06:49:43 crc kubenswrapper[4841]: I0130 06:49:43.686656 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665cc8dc4f-n4nmm"] Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.236911 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd6dfbc4f-mgq48" event={"ID":"32f3107e-b6dd-4265-8517-b38b5f4eae56","Type":"ContainerStarted","Data":"406cd56d527d429b3ebcae2b9752bf3d593a236cc8100ce796cf7d1e715f875d"} Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.240268 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665cc8dc4f-n4nmm" event={"ID":"342b4f2d-a28e-4a01-9d96-bac5f85b92d1","Type":"ContainerStarted","Data":"d259b9e679fe3f4fe4a4a7d92424e94b0639eb027b880ea20b121358266f85fd"} Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.311996 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cd6dfbc4f-mgq48"] Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.354819 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67cf966bf4-l6cdd"] Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.356984 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.359836 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.370045 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67cf966bf4-l6cdd"] Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.415308 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665cc8dc4f-n4nmm"] Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.449980 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c6697b658-bhtrl"] Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.451499 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.465748 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c6697b658-bhtrl"] Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.500306 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-secret-key\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.500342 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-combined-ca-bundle\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.500386 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-config-data\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.500535 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-scripts\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.500683 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-logs\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.500863 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-tls-certs\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.500892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g6gp\" (UniqueName: \"kubernetes.io/projected/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-kube-api-access-6g6gp\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-secret-key\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603213 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-combined-ca-bundle\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603253 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-config-data\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-tls-certs\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603391 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-config-data\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603548 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-scripts\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-scripts\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603861 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-combined-ca-bundle\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603883 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfa7cd93-66af-4dae-a610-d61b6235af81-logs\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603938 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjt9n\" (UniqueName: \"kubernetes.io/projected/cfa7cd93-66af-4dae-a610-d61b6235af81-kube-api-access-hjt9n\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.603979 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-secret-key\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.604108 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-logs\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.604320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-tls-certs\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.604355 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g6gp\" (UniqueName: \"kubernetes.io/projected/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-kube-api-access-6g6gp\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.604827 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-scripts\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.604979 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-config-data\") pod \"horizon-67cf966bf4-l6cdd\" (UID: 
\"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.605030 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-logs\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.608335 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-secret-key\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.609737 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-combined-ca-bundle\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.612251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-tls-certs\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.624160 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g6gp\" (UniqueName: \"kubernetes.io/projected/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-kube-api-access-6g6gp\") pod \"horizon-67cf966bf4-l6cdd\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: 
I0130 06:49:44.682007 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.706671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-combined-ca-bundle\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.706716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfa7cd93-66af-4dae-a610-d61b6235af81-logs\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.706742 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjt9n\" (UniqueName: \"kubernetes.io/projected/cfa7cd93-66af-4dae-a610-d61b6235af81-kube-api-access-hjt9n\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.706767 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-secret-key\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.706874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-config-data\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " 
pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.706904 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-tls-certs\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.706928 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-scripts\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.707786 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-scripts\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.711252 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-config-data\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.712362 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-tls-certs\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.712450 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfa7cd93-66af-4dae-a610-d61b6235af81-logs\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.714161 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-secret-key\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.717756 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-combined-ca-bundle\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:44 crc kubenswrapper[4841]: I0130 06:49:44.724547 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjt9n\" (UniqueName: \"kubernetes.io/projected/cfa7cd93-66af-4dae-a610-d61b6235af81-kube-api-access-hjt9n\") pod \"horizon-c6697b658-bhtrl\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") " pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:46 crc kubenswrapper[4841]: I0130 06:49:44.792969 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:49:46 crc kubenswrapper[4841]: I0130 06:49:46.265197 4841 generic.go:334] "Generic (PLEG): container finished" podID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerID="80f19d874887893ff5379339d65dec8893e864761923c4b813ab8523b9791d51" exitCode=0 Jan 30 06:49:46 crc kubenswrapper[4841]: I0130 06:49:46.265316 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b63ca59c-2c67-4c8d-8cef-344f3e76ba06","Type":"ContainerDied","Data":"80f19d874887893ff5379339d65dec8893e864761923c4b813ab8523b9791d51"} Jan 30 06:49:46 crc kubenswrapper[4841]: I0130 06:49:46.270100 4841 generic.go:334] "Generic (PLEG): container finished" podID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerID="71df0b5d837f19d4320f3cdb5e1bd39c13694d215319db8e6ce04bcd80b0fbae" exitCode=0 Jan 30 06:49:46 crc kubenswrapper[4841]: I0130 06:49:46.270140 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95f2605a-425e-4320-a8c0-e78d3ad93fbb","Type":"ContainerDied","Data":"71df0b5d837f19d4320f3cdb5e1bd39c13694d215319db8e6ce04bcd80b0fbae"} Jan 30 06:49:47 crc kubenswrapper[4841]: I0130 06:49:47.325255 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67cf966bf4-l6cdd"] Jan 30 06:49:47 crc kubenswrapper[4841]: I0130 06:49:47.413087 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c6697b658-bhtrl"] Jan 30 06:49:52 crc kubenswrapper[4841]: W0130 06:49:52.053051 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfa7cd93_66af_4dae_a610_d61b6235af81.slice/crio-07be13e42f8f1e277979d6d1e4463c7ce9aa70bf75e5a8466f4b98fd69193687 WatchSource:0}: Error finding container 07be13e42f8f1e277979d6d1e4463c7ce9aa70bf75e5a8466f4b98fd69193687: Status 404 returned error can't find the container with id 
07be13e42f8f1e277979d6d1e4463c7ce9aa70bf75e5a8466f4b98fd69193687 Jan 30 06:49:52 crc kubenswrapper[4841]: W0130 06:49:52.068361 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb32b5d51_02d2_4eb7_9e52_09e2b5ec908f.slice/crio-2615dedd4c27c5625cb603030b57c855bbdfa4151b12ed6931f79c3598dcc30e WatchSource:0}: Error finding container 2615dedd4c27c5625cb603030b57c855bbdfa4151b12ed6931f79c3598dcc30e: Status 404 returned error can't find the container with id 2615dedd4c27c5625cb603030b57c855bbdfa4151b12ed6931f79c3598dcc30e Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.202830 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.209017 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.342495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95f2605a-425e-4320-a8c0-e78d3ad93fbb","Type":"ContainerDied","Data":"f82953e97852f28f78d9a9197436b66479991a54a292afbf06ca136c6a3af681"} Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.342552 4841 scope.go:117] "RemoveContainer" containerID="71df0b5d837f19d4320f3cdb5e1bd39c13694d215319db8e6ce04bcd80b0fbae" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.342688 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.352734 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b63ca59c-2c67-4c8d-8cef-344f3e76ba06","Type":"ContainerDied","Data":"7bb80d22258d1cf0748f2f288adaec59ba82de119a9327a15a27c6bae10d45c4"} Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.352831 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.358865 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6697b658-bhtrl" event={"ID":"cfa7cd93-66af-4dae-a610-d61b6235af81","Type":"ContainerStarted","Data":"07be13e42f8f1e277979d6d1e4463c7ce9aa70bf75e5a8466f4b98fd69193687"} Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.360304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cf966bf4-l6cdd" event={"ID":"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f","Type":"ContainerStarted","Data":"2615dedd4c27c5625cb603030b57c855bbdfa4151b12ed6931f79c3598dcc30e"} Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380138 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-httpd-run\") pod \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82js2\" (UniqueName: \"kubernetes.io/projected/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-kube-api-access-82js2\") pod \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380283 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-internal-tls-certs\") pod \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380319 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-combined-ca-bundle\") pod \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380351 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-combined-ca-bundle\") pod \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380417 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-scripts\") pod \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380465 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-config-data\") pod \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380487 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-config-data\") pod \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\" (UID: 
\"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380543 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-httpd-run\") pod \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380573 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jflds\" (UniqueName: \"kubernetes.io/projected/95f2605a-425e-4320-a8c0-e78d3ad93fbb-kube-api-access-jflds\") pod \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380589 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-logs\") pod \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380637 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-scripts\") pod \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380662 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-logs\") pod \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\" (UID: \"b63ca59c-2c67-4c8d-8cef-344f3e76ba06\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.380687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-public-tls-certs\") pod \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\" (UID: \"95f2605a-425e-4320-a8c0-e78d3ad93fbb\") " Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.381239 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b63ca59c-2c67-4c8d-8cef-344f3e76ba06" (UID: "b63ca59c-2c67-4c8d-8cef-344f3e76ba06"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.383631 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95f2605a-425e-4320-a8c0-e78d3ad93fbb" (UID: "95f2605a-425e-4320-a8c0-e78d3ad93fbb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.384245 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-logs" (OuterVolumeSpecName: "logs") pod "b63ca59c-2c67-4c8d-8cef-344f3e76ba06" (UID: "b63ca59c-2c67-4c8d-8cef-344f3e76ba06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.385839 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-logs" (OuterVolumeSpecName: "logs") pod "95f2605a-425e-4320-a8c0-e78d3ad93fbb" (UID: "95f2605a-425e-4320-a8c0-e78d3ad93fbb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.386798 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-kube-api-access-82js2" (OuterVolumeSpecName: "kube-api-access-82js2") pod "b63ca59c-2c67-4c8d-8cef-344f3e76ba06" (UID: "b63ca59c-2c67-4c8d-8cef-344f3e76ba06"). InnerVolumeSpecName "kube-api-access-82js2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.389352 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-scripts" (OuterVolumeSpecName: "scripts") pod "b63ca59c-2c67-4c8d-8cef-344f3e76ba06" (UID: "b63ca59c-2c67-4c8d-8cef-344f3e76ba06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.412737 4841 scope.go:117] "RemoveContainer" containerID="91bc07556715b9b9bfde8ae1343195414189761c5a84ed76fb8332062c9a4a0c" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.412932 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-scripts" (OuterVolumeSpecName: "scripts") pod "95f2605a-425e-4320-a8c0-e78d3ad93fbb" (UID: "95f2605a-425e-4320-a8c0-e78d3ad93fbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.413083 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f2605a-425e-4320-a8c0-e78d3ad93fbb-kube-api-access-jflds" (OuterVolumeSpecName: "kube-api-access-jflds") pod "95f2605a-425e-4320-a8c0-e78d3ad93fbb" (UID: "95f2605a-425e-4320-a8c0-e78d3ad93fbb"). InnerVolumeSpecName "kube-api-access-jflds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.473821 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b63ca59c-2c67-4c8d-8cef-344f3e76ba06" (UID: "b63ca59c-2c67-4c8d-8cef-344f3e76ba06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.474776 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-config-data" (OuterVolumeSpecName: "config-data") pod "b63ca59c-2c67-4c8d-8cef-344f3e76ba06" (UID: "b63ca59c-2c67-4c8d-8cef-344f3e76ba06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.476183 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95f2605a-425e-4320-a8c0-e78d3ad93fbb" (UID: "95f2605a-425e-4320-a8c0-e78d3ad93fbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483248 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483361 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82js2\" (UniqueName: \"kubernetes.io/projected/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-kube-api-access-82js2\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483455 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483523 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483576 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483666 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483721 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483774 4841 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jflds\" (UniqueName: \"kubernetes.io/projected/95f2605a-425e-4320-a8c0-e78d3ad93fbb-kube-api-access-jflds\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483825 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f2605a-425e-4320-a8c0-e78d3ad93fbb-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483873 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.483924 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.488113 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b63ca59c-2c67-4c8d-8cef-344f3e76ba06" (UID: "b63ca59c-2c67-4c8d-8cef-344f3e76ba06"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.492354 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-config-data" (OuterVolumeSpecName: "config-data") pod "95f2605a-425e-4320-a8c0-e78d3ad93fbb" (UID: "95f2605a-425e-4320-a8c0-e78d3ad93fbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.497645 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "95f2605a-425e-4320-a8c0-e78d3ad93fbb" (UID: "95f2605a-425e-4320-a8c0-e78d3ad93fbb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.529521 4841 scope.go:117] "RemoveContainer" containerID="80f19d874887893ff5379339d65dec8893e864761923c4b813ab8523b9791d51" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.563756 4841 scope.go:117] "RemoveContainer" containerID="f23d5789a7d0f92a2ffe4f082632a36d784cdaaf4d451234700e7ea77d4bc8d5" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.585948 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.585992 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b63ca59c-2c67-4c8d-8cef-344f3e76ba06-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.586002 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f2605a-425e-4320-a8c0-e78d3ad93fbb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.703803 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.748910 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:49:52 
crc kubenswrapper[4841]: I0130 06:49:52.764581 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.801907 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:49:52 crc kubenswrapper[4841]: E0130 06:49:52.802516 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-httpd" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802531 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-httpd" Jan 30 06:49:52 crc kubenswrapper[4841]: E0130 06:49:52.802540 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-log" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802546 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-log" Jan 30 06:49:52 crc kubenswrapper[4841]: E0130 06:49:52.802563 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-log" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802569 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-log" Jan 30 06:49:52 crc kubenswrapper[4841]: E0130 06:49:52.802603 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-httpd" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802609 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-httpd" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802822 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-httpd" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802842 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-httpd" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802871 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" containerName="glance-log" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.802889 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" containerName="glance-log" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.804479 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.806686 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.807143 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s4m68" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.807282 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.816930 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.829930 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.855039 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.859718 4841 scope.go:117] "RemoveContainer" 
containerID="867d005cd21bd781ade02045dfd4c6c1ccd01682a05940e6aeef94bf359071ca" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.864234 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.865961 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.868457 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.869704 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.876165 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.892960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.893230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75a77158-2a09-4554-b6c1-c04b11fff9cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.893292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.893313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gthtb\" (UniqueName: \"kubernetes.io/projected/75a77158-2a09-4554-b6c1-c04b11fff9cf-kube-api-access-gthtb\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.893348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.893387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.893429 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75a77158-2a09-4554-b6c1-c04b11fff9cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.904124 4841 scope.go:117] "RemoveContainer" 
containerID="283f4f5754c1552cbee44d9d748bf54079893509b6ca16e144da1a9ea78724d3" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.941985 4841 scope.go:117] "RemoveContainer" containerID="fc64e979b3458d4a87bcfd6e8565b68dd5cc597575da82700f3703fa395a8367" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994680 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrxf\" (UniqueName: \"kubernetes.io/projected/02d60ed7-0a8d-45c7-8448-89a43c660178-kube-api-access-2lrxf\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994742 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-config-data\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994767 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994787 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75a77158-2a09-4554-b6c1-c04b11fff9cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994830 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02d60ed7-0a8d-45c7-8448-89a43c660178-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994880 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994899 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gthtb\" (UniqueName: \"kubernetes.io/projected/75a77158-2a09-4554-b6c1-c04b11fff9cf-kube-api-access-gthtb\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994929 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.994972 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.995000 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75a77158-2a09-4554-b6c1-c04b11fff9cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.995028 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-scripts\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.995045 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.995060 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d60ed7-0a8d-45c7-8448-89a43c660178-logs\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.996329 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/75a77158-2a09-4554-b6c1-c04b11fff9cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:52 crc kubenswrapper[4841]: I0130 06:49:52.996644 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75a77158-2a09-4554-b6c1-c04b11fff9cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.003627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.003775 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.004130 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.010003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75a77158-2a09-4554-b6c1-c04b11fff9cf-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.010860 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gthtb\" (UniqueName: \"kubernetes.io/projected/75a77158-2a09-4554-b6c1-c04b11fff9cf-kube-api-access-gthtb\") pod \"glance-default-internal-api-0\" (UID: \"75a77158-2a09-4554-b6c1-c04b11fff9cf\") " pod="openstack/glance-default-internal-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.111521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.111586 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrxf\" (UniqueName: \"kubernetes.io/projected/02d60ed7-0a8d-45c7-8448-89a43c660178-kube-api-access-2lrxf\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.111626 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-config-data\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.111730 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02d60ed7-0a8d-45c7-8448-89a43c660178-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.112046 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-scripts\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.112076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.112096 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d60ed7-0a8d-45c7-8448-89a43c660178-logs\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.112763 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d60ed7-0a8d-45c7-8448-89a43c660178-logs\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.113073 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02d60ed7-0a8d-45c7-8448-89a43c660178-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0" Jan 30 06:49:53 crc 
kubenswrapper[4841]: I0130 06:49:53.116873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.125063 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.125400 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-config-data\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.127866 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.129992 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d60ed7-0a8d-45c7-8448-89a43c660178-scripts\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.133590 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrxf\" (UniqueName: \"kubernetes.io/projected/02d60ed7-0a8d-45c7-8448-89a43c660178-kube-api-access-2lrxf\") pod \"glance-default-external-api-0\" (UID: \"02d60ed7-0a8d-45c7-8448-89a43c660178\") " pod="openstack/glance-default-external-api-0"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.187207 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.378238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cf966bf4-l6cdd" event={"ID":"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f","Type":"ContainerStarted","Data":"67d26d8a2b7bb3854d4c64a0d38149ff1c129f3f6a963336277c0f8ab206577b"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.378914 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cf966bf4-l6cdd" event={"ID":"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f","Type":"ContainerStarted","Data":"9cd67ad651434aa9878f56de8705229c2671573dd5e09e67f7fe1f4c9ff352b6"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.386368 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665cc8dc4f-n4nmm" event={"ID":"342b4f2d-a28e-4a01-9d96-bac5f85b92d1","Type":"ContainerStarted","Data":"05566e8161c5101359e42a4e28748b89fb0babfac3acb4bf52b7cdbc535fbf99"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 
06:49:53.386427 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665cc8dc4f-n4nmm" event={"ID":"342b4f2d-a28e-4a01-9d96-bac5f85b92d1","Type":"ContainerStarted","Data":"8b2b6d23f6d6a879504208c6851105f3921849f47223b26d5639cd5f415cbac9"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.386550 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-665cc8dc4f-n4nmm" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerName="horizon-log" containerID="cri-o://8b2b6d23f6d6a879504208c6851105f3921849f47223b26d5639cd5f415cbac9" gracePeriod=30
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.386783 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-665cc8dc4f-n4nmm" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerName="horizon" containerID="cri-o://05566e8161c5101359e42a4e28748b89fb0babfac3acb4bf52b7cdbc535fbf99" gracePeriod=30
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.399785 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6697b658-bhtrl" event={"ID":"cfa7cd93-66af-4dae-a610-d61b6235af81","Type":"ContainerStarted","Data":"d49ea3d715b7b663c44877634bae9eb11ba651f1df614d0a17cf00d282c2a96c"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.399836 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6697b658-bhtrl" event={"ID":"cfa7cd93-66af-4dae-a610-d61b6235af81","Type":"ContainerStarted","Data":"26bf91e83f7f783284c5ae3f95e4da9624775557e77d94dea27183e0d82dd032"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.404279 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd6dfbc4f-mgq48" event={"ID":"32f3107e-b6dd-4265-8517-b38b5f4eae56","Type":"ContainerStarted","Data":"cbfbc64cb59db5551caad92a4386f7c31a23575e8edfba3cbc79b10c1cc230b3"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.404331 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-5cd6dfbc4f-mgq48" event={"ID":"32f3107e-b6dd-4265-8517-b38b5f4eae56","Type":"ContainerStarted","Data":"c54062dcbea7ed0a094727f88d48c5ccf8487d469448d1f2b652bab61c963314"}
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.404472 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cd6dfbc4f-mgq48" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon-log" containerID="cri-o://c54062dcbea7ed0a094727f88d48c5ccf8487d469448d1f2b652bab61c963314" gracePeriod=30
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.404726 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cd6dfbc4f-mgq48" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon" containerID="cri-o://cbfbc64cb59db5551caad92a4386f7c31a23575e8edfba3cbc79b10c1cc230b3" gracePeriod=30
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.409634 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67cf966bf4-l6cdd" podStartSLOduration=8.942921298 podStartE2EDuration="9.409610596s" podCreationTimestamp="2026-01-30 06:49:44 +0000 UTC" firstStartedPulling="2026-01-30 06:49:52.072278745 +0000 UTC m=+6129.065751383" lastFinishedPulling="2026-01-30 06:49:52.538968043 +0000 UTC m=+6129.532440681" observedRunningTime="2026-01-30 06:49:53.400983716 +0000 UTC m=+6130.394456364" watchObservedRunningTime="2026-01-30 06:49:53.409610596 +0000 UTC m=+6130.403083244"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.433265 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c6697b658-bhtrl" podStartSLOduration=8.928202845 podStartE2EDuration="9.433242596s" podCreationTimestamp="2026-01-30 06:49:44 +0000 UTC" firstStartedPulling="2026-01-30 06:49:52.05970201 +0000 UTC m=+6129.053174648" lastFinishedPulling="2026-01-30 06:49:52.564741761 +0000 UTC m=+6129.558214399" 
observedRunningTime="2026-01-30 06:49:53.427410851 +0000 UTC m=+6130.420883489" watchObservedRunningTime="2026-01-30 06:49:53.433242596 +0000 UTC m=+6130.426715234"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.448390 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-665cc8dc4f-n4nmm" podStartSLOduration=2.895250172 podStartE2EDuration="11.44836918s" podCreationTimestamp="2026-01-30 06:49:42 +0000 UTC" firstStartedPulling="2026-01-30 06:49:43.701251094 +0000 UTC m=+6120.694723752" lastFinishedPulling="2026-01-30 06:49:52.254370082 +0000 UTC m=+6129.247842760" observedRunningTime="2026-01-30 06:49:53.444710552 +0000 UTC m=+6130.438183200" watchObservedRunningTime="2026-01-30 06:49:53.44836918 +0000 UTC m=+6130.441841818"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.470436 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cd6dfbc4f-mgq48" podStartSLOduration=2.481044872 podStartE2EDuration="11.470387107s" podCreationTimestamp="2026-01-30 06:49:42 +0000 UTC" firstStartedPulling="2026-01-30 06:49:43.292829899 +0000 UTC m=+6120.286302527" lastFinishedPulling="2026-01-30 06:49:52.282172124 +0000 UTC m=+6129.275644762" observedRunningTime="2026-01-30 06:49:53.462695632 +0000 UTC m=+6130.456168270" watchObservedRunningTime="2026-01-30 06:49:53.470387107 +0000 UTC m=+6130.463859745"
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.679176 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 06:49:53 crc kubenswrapper[4841]: I0130 06:49:53.888506 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.418757 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"02d60ed7-0a8d-45c7-8448-89a43c660178","Type":"ContainerStarted","Data":"0ec791cda14c0c454d9970a849baff3c148b20fed76df3342bf4aea5752cc390"}
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.422345 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75a77158-2a09-4554-b6c1-c04b11fff9cf","Type":"ContainerStarted","Data":"fcfdcfab141b4d80d02f955e915a00620ee14e26556654e222959234c87f41b7"}
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.445226 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f2605a-425e-4320-a8c0-e78d3ad93fbb" path="/var/lib/kubelet/pods/95f2605a-425e-4320-a8c0-e78d3ad93fbb/volumes"
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.446095 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63ca59c-2c67-4c8d-8cef-344f3e76ba06" path="/var/lib/kubelet/pods/b63ca59c-2c67-4c8d-8cef-344f3e76ba06/volumes"
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.682513 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67cf966bf4-l6cdd"
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.682869 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67cf966bf4-l6cdd"
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.793423 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c6697b658-bhtrl"
Jan 30 06:49:54 crc kubenswrapper[4841]: I0130 06:49:54.793466 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c6697b658-bhtrl"
Jan 30 06:49:55 crc kubenswrapper[4841]: I0130 06:49:55.434047 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02d60ed7-0a8d-45c7-8448-89a43c660178","Type":"ContainerStarted","Data":"2c4518e73a7c06492810b69fb6651139b615b019d3662246254653a6ba24e4bd"}
Jan 30 
06:49:55 crc kubenswrapper[4841]: I0130 06:49:55.434328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02d60ed7-0a8d-45c7-8448-89a43c660178","Type":"ContainerStarted","Data":"ab8bc150b5389e29512193157ba7b6fa6f31d90443f3c4a8b87f10fe940df87d"}
Jan 30 06:49:55 crc kubenswrapper[4841]: I0130 06:49:55.437184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75a77158-2a09-4554-b6c1-c04b11fff9cf","Type":"ContainerStarted","Data":"88d00d67e5e695dc24e7a702703d3eb83daff87900686cb24c16e8c5c8eb029e"}
Jan 30 06:49:55 crc kubenswrapper[4841]: I0130 06:49:55.437237 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75a77158-2a09-4554-b6c1-c04b11fff9cf","Type":"ContainerStarted","Data":"a9276b9bd2ee2007d32e1a3b41ca1e2d8c35de0169b910ffa8ea538f4a63e480"}
Jan 30 06:49:55 crc kubenswrapper[4841]: I0130 06:49:55.460039 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.4600146560000002 podStartE2EDuration="3.460014656s" podCreationTimestamp="2026-01-30 06:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:49:55.457019707 +0000 UTC m=+6132.450492345" watchObservedRunningTime="2026-01-30 06:49:55.460014656 +0000 UTC m=+6132.453487304"
Jan 30 06:49:55 crc kubenswrapper[4841]: I0130 06:49:55.481521 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.48150299 podStartE2EDuration="3.48150299s" podCreationTimestamp="2026-01-30 06:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:49:55.477983976 +0000 UTC 
m=+6132.471456634" watchObservedRunningTime="2026-01-30 06:49:55.48150299 +0000 UTC m=+6132.474975638"
Jan 30 06:49:57 crc kubenswrapper[4841]: I0130 06:49:57.064340 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9mnmt"]
Jan 30 06:49:57 crc kubenswrapper[4841]: I0130 06:49:57.081297 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9mnmt"]
Jan 30 06:49:58 crc kubenswrapper[4841]: I0130 06:49:58.031182 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1c57-account-create-update-gz9rp"]
Jan 30 06:49:58 crc kubenswrapper[4841]: I0130 06:49:58.040624 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1c57-account-create-update-gz9rp"]
Jan 30 06:49:58 crc kubenswrapper[4841]: I0130 06:49:58.447275 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b11e52-0c46-441a-9938-0b4237eb9e94" path="/var/lib/kubelet/pods/c9b11e52-0c46-441a-9938-0b4237eb9e94/volumes"
Jan 30 06:49:58 crc kubenswrapper[4841]: I0130 06:49:58.447923 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c76d5a-1ed6-49cc-8c42-ebc88890c477" path="/var/lib/kubelet/pods/c9c76d5a-1ed6-49cc-8c42-ebc88890c477/volumes"
Jan 30 06:50:02 crc kubenswrapper[4841]: I0130 06:50:02.790500 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cd6dfbc4f-mgq48"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.129322 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.129426 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.188332 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.188449 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.195579 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.203589 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.226481 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-665cc8dc4f-n4nmm"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.261904 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.271487 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.542347 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.542445 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.542467 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:03 crc kubenswrapper[4841]: I0130 06:50:03.542487 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:04 crc kubenswrapper[4841]: I0130 06:50:04.684475 4841 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-67cf966bf4-l6cdd" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused"
Jan 30 06:50:04 crc kubenswrapper[4841]: I0130 06:50:04.795517 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c6697b658-bhtrl" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.121:8443: connect: connection refused"
Jan 30 06:50:05 crc kubenswrapper[4841]: I0130 06:50:05.734031 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 06:50:05 crc kubenswrapper[4841]: I0130 06:50:05.734113 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 06:50:05 crc kubenswrapper[4841]: I0130 06:50:05.769683 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 06:50:05 crc kubenswrapper[4841]: I0130 06:50:05.859387 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:05 crc kubenswrapper[4841]: I0130 06:50:05.859510 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 06:50:05 crc kubenswrapper[4841]: I0130 06:50:05.864266 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 30 06:50:10 crc kubenswrapper[4841]: I0130 06:50:10.463782 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body=
Jan 30 06:50:10 crc kubenswrapper[4841]: I0130 06:50:10.464263 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:50:11 crc kubenswrapper[4841]: I0130 06:50:11.036797 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-b9xdq"]
Jan 30 06:50:11 crc kubenswrapper[4841]: I0130 06:50:11.045575 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-b9xdq"]
Jan 30 06:50:12 crc kubenswrapper[4841]: I0130 06:50:12.474231 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5825634c-2fce-4b58-8ed9-c6e113c162f0" path="/var/lib/kubelet/pods/5825634c-2fce-4b58-8ed9-c6e113c162f0/volumes"
Jan 30 06:50:16 crc kubenswrapper[4841]: I0130 06:50:16.492184 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67cf966bf4-l6cdd"
Jan 30 06:50:16 crc kubenswrapper[4841]: I0130 06:50:16.496162 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-c6697b658-bhtrl"
Jan 30 06:50:17 crc kubenswrapper[4841]: I0130 06:50:17.976459 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67cf966bf4-l6cdd"
Jan 30 06:50:18 crc kubenswrapper[4841]: I0130 06:50:18.040869 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-c6697b658-bhtrl"
Jan 30 06:50:18 crc kubenswrapper[4841]: I0130 06:50:18.126745 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67cf966bf4-l6cdd"]
Jan 30 06:50:18 crc kubenswrapper[4841]: I0130 06:50:18.698117 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-67cf966bf4-l6cdd" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon-log" containerID="cri-o://9cd67ad651434aa9878f56de8705229c2671573dd5e09e67f7fe1f4c9ff352b6" gracePeriod=30
Jan 30 06:50:18 crc kubenswrapper[4841]: I0130 06:50:18.698196 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67cf966bf4-l6cdd" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" containerID="cri-o://67d26d8a2b7bb3854d4c64a0d38149ff1c129f3f6a963336277c0f8ab206577b" gracePeriod=30
Jan 30 06:50:22 crc kubenswrapper[4841]: I0130 06:50:22.741194 4841 generic.go:334] "Generic (PLEG): container finished" podID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerID="67d26d8a2b7bb3854d4c64a0d38149ff1c129f3f6a963336277c0f8ab206577b" exitCode=0
Jan 30 06:50:22 crc kubenswrapper[4841]: I0130 06:50:22.741274 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cf966bf4-l6cdd" event={"ID":"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f","Type":"ContainerDied","Data":"67d26d8a2b7bb3854d4c64a0d38149ff1c129f3f6a963336277c0f8ab206577b"}
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.767734 4841 generic.go:334] "Generic (PLEG): container finished" podID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerID="cbfbc64cb59db5551caad92a4386f7c31a23575e8edfba3cbc79b10c1cc230b3" exitCode=137
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.768189 4841 generic.go:334] "Generic (PLEG): container finished" podID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerID="c54062dcbea7ed0a094727f88d48c5ccf8487d469448d1f2b652bab61c963314" exitCode=137
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.767992 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd6dfbc4f-mgq48" event={"ID":"32f3107e-b6dd-4265-8517-b38b5f4eae56","Type":"ContainerDied","Data":"cbfbc64cb59db5551caad92a4386f7c31a23575e8edfba3cbc79b10c1cc230b3"}
Jan 30 06:50:23 crc kubenswrapper[4841]: 
I0130 06:50:23.768250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd6dfbc4f-mgq48" event={"ID":"32f3107e-b6dd-4265-8517-b38b5f4eae56","Type":"ContainerDied","Data":"c54062dcbea7ed0a094727f88d48c5ccf8487d469448d1f2b652bab61c963314"}
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.773541 4841 generic.go:334] "Generic (PLEG): container finished" podID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerID="05566e8161c5101359e42a4e28748b89fb0babfac3acb4bf52b7cdbc535fbf99" exitCode=137
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.773566 4841 generic.go:334] "Generic (PLEG): container finished" podID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerID="8b2b6d23f6d6a879504208c6851105f3921849f47223b26d5639cd5f415cbac9" exitCode=137
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.773583 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665cc8dc4f-n4nmm" event={"ID":"342b4f2d-a28e-4a01-9d96-bac5f85b92d1","Type":"ContainerDied","Data":"05566e8161c5101359e42a4e28748b89fb0babfac3acb4bf52b7cdbc535fbf99"}
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.773602 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665cc8dc4f-n4nmm" event={"ID":"342b4f2d-a28e-4a01-9d96-bac5f85b92d1","Type":"ContainerDied","Data":"8b2b6d23f6d6a879504208c6851105f3921849f47223b26d5639cd5f415cbac9"}
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.935467 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665cc8dc4f-n4nmm"
Jan 30 06:50:23 crc kubenswrapper[4841]: I0130 06:50:23.948197 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cd6dfbc4f-mgq48"
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.038733 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-config-data\") pod \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.038814 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfqdj\" (UniqueName: \"kubernetes.io/projected/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-kube-api-access-bfqdj\") pod \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.039190 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-logs\") pod \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.039273 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-scripts\") pod \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.039322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-horizon-secret-key\") pod \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\" (UID: \"342b4f2d-a28e-4a01-9d96-bac5f85b92d1\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.039857 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-logs" (OuterVolumeSpecName: "logs") pod "342b4f2d-a28e-4a01-9d96-bac5f85b92d1" (UID: "342b4f2d-a28e-4a01-9d96-bac5f85b92d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.045161 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "342b4f2d-a28e-4a01-9d96-bac5f85b92d1" (UID: "342b4f2d-a28e-4a01-9d96-bac5f85b92d1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.048776 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-kube-api-access-bfqdj" (OuterVolumeSpecName: "kube-api-access-bfqdj") pod "342b4f2d-a28e-4a01-9d96-bac5f85b92d1" (UID: "342b4f2d-a28e-4a01-9d96-bac5f85b92d1"). InnerVolumeSpecName "kube-api-access-bfqdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.064225 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-config-data" (OuterVolumeSpecName: "config-data") pod "342b4f2d-a28e-4a01-9d96-bac5f85b92d1" (UID: "342b4f2d-a28e-4a01-9d96-bac5f85b92d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.065451 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-scripts" (OuterVolumeSpecName: "scripts") pod "342b4f2d-a28e-4a01-9d96-bac5f85b92d1" (UID: "342b4f2d-a28e-4a01-9d96-bac5f85b92d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.141214 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-scripts\") pod \"32f3107e-b6dd-4265-8517-b38b5f4eae56\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.141527 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f3107e-b6dd-4265-8517-b38b5f4eae56-logs\") pod \"32f3107e-b6dd-4265-8517-b38b5f4eae56\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.141622 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-config-data\") pod \"32f3107e-b6dd-4265-8517-b38b5f4eae56\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.141741 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32f3107e-b6dd-4265-8517-b38b5f4eae56-horizon-secret-key\") pod \"32f3107e-b6dd-4265-8517-b38b5f4eae56\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.141793 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zm4w\" (UniqueName: \"kubernetes.io/projected/32f3107e-b6dd-4265-8517-b38b5f4eae56-kube-api-access-8zm4w\") pod \"32f3107e-b6dd-4265-8517-b38b5f4eae56\" (UID: \"32f3107e-b6dd-4265-8517-b38b5f4eae56\") "
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.141905 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/32f3107e-b6dd-4265-8517-b38b5f4eae56-logs" (OuterVolumeSpecName: "logs") pod "32f3107e-b6dd-4265-8517-b38b5f4eae56" (UID: "32f3107e-b6dd-4265-8517-b38b5f4eae56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.142338 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfqdj\" (UniqueName: \"kubernetes.io/projected/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-kube-api-access-bfqdj\") on node \"crc\" DevicePath \"\""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.142356 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.142379 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.142387 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32f3107e-b6dd-4265-8517-b38b5f4eae56-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.142420 4841 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.142430 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b4f2d-a28e-4a01-9d96-bac5f85b92d1-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.145298 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/32f3107e-b6dd-4265-8517-b38b5f4eae56-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "32f3107e-b6dd-4265-8517-b38b5f4eae56" (UID: "32f3107e-b6dd-4265-8517-b38b5f4eae56"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.145848 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f3107e-b6dd-4265-8517-b38b5f4eae56-kube-api-access-8zm4w" (OuterVolumeSpecName: "kube-api-access-8zm4w") pod "32f3107e-b6dd-4265-8517-b38b5f4eae56" (UID: "32f3107e-b6dd-4265-8517-b38b5f4eae56"). InnerVolumeSpecName "kube-api-access-8zm4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.163974 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-scripts" (OuterVolumeSpecName: "scripts") pod "32f3107e-b6dd-4265-8517-b38b5f4eae56" (UID: "32f3107e-b6dd-4265-8517-b38b5f4eae56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.173284 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-config-data" (OuterVolumeSpecName: "config-data") pod "32f3107e-b6dd-4265-8517-b38b5f4eae56" (UID: "32f3107e-b6dd-4265-8517-b38b5f4eae56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.243980 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.244021 4841 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32f3107e-b6dd-4265-8517-b38b5f4eae56-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.244036 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zm4w\" (UniqueName: \"kubernetes.io/projected/32f3107e-b6dd-4265-8517-b38b5f4eae56-kube-api-access-8zm4w\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.244054 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32f3107e-b6dd-4265-8517-b38b5f4eae56-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.682738 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cf966bf4-l6cdd" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.788598 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665cc8dc4f-n4nmm" event={"ID":"342b4f2d-a28e-4a01-9d96-bac5f85b92d1","Type":"ContainerDied","Data":"d259b9e679fe3f4fe4a4a7d92424e94b0639eb027b880ea20b121358266f85fd"} Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.788647 4841 scope.go:117] "RemoveContainer" 
containerID="05566e8161c5101359e42a4e28748b89fb0babfac3acb4bf52b7cdbc535fbf99" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.788770 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665cc8dc4f-n4nmm" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.793897 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd6dfbc4f-mgq48" event={"ID":"32f3107e-b6dd-4265-8517-b38b5f4eae56","Type":"ContainerDied","Data":"406cd56d527d429b3ebcae2b9752bf3d593a236cc8100ce796cf7d1e715f875d"} Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.794194 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cd6dfbc4f-mgq48" Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.826470 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665cc8dc4f-n4nmm"] Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.835709 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-665cc8dc4f-n4nmm"] Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.855089 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cd6dfbc4f-mgq48"] Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.863411 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cd6dfbc4f-mgq48"] Jan 30 06:50:24 crc kubenswrapper[4841]: I0130 06:50:24.989855 4841 scope.go:117] "RemoveContainer" containerID="8b2b6d23f6d6a879504208c6851105f3921849f47223b26d5639cd5f415cbac9" Jan 30 06:50:25 crc kubenswrapper[4841]: I0130 06:50:25.010579 4841 scope.go:117] "RemoveContainer" containerID="cbfbc64cb59db5551caad92a4386f7c31a23575e8edfba3cbc79b10c1cc230b3" Jan 30 06:50:25 crc kubenswrapper[4841]: I0130 06:50:25.182822 4841 scope.go:117] "RemoveContainer" containerID="c54062dcbea7ed0a094727f88d48c5ccf8487d469448d1f2b652bab61c963314" Jan 30 06:50:26 crc kubenswrapper[4841]: I0130 06:50:26.444840 
4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" path="/var/lib/kubelet/pods/32f3107e-b6dd-4265-8517-b38b5f4eae56/volumes" Jan 30 06:50:26 crc kubenswrapper[4841]: I0130 06:50:26.446829 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" path="/var/lib/kubelet/pods/342b4f2d-a28e-4a01-9d96-bac5f85b92d1/volumes" Jan 30 06:50:34 crc kubenswrapper[4841]: I0130 06:50:34.683092 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cf966bf4-l6cdd" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Jan 30 06:50:40 crc kubenswrapper[4841]: I0130 06:50:40.464012 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:50:40 crc kubenswrapper[4841]: I0130 06:50:40.464919 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:50:44 crc kubenswrapper[4841]: I0130 06:50:44.683354 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67cf966bf4-l6cdd" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Jan 30 06:50:44 crc 
kubenswrapper[4841]: I0130 06:50:44.684153 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.091926 4841 generic.go:334] "Generic (PLEG): container finished" podID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerID="9cd67ad651434aa9878f56de8705229c2671573dd5e09e67f7fe1f4c9ff352b6" exitCode=137 Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.092029 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cf966bf4-l6cdd" event={"ID":"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f","Type":"ContainerDied","Data":"9cd67ad651434aa9878f56de8705229c2671573dd5e09e67f7fe1f4c9ff352b6"} Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.092721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67cf966bf4-l6cdd" event={"ID":"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f","Type":"ContainerDied","Data":"2615dedd4c27c5625cb603030b57c855bbdfa4151b12ed6931f79c3598dcc30e"} Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.092750 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2615dedd4c27c5625cb603030b57c855bbdfa4151b12ed6931f79c3598dcc30e" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.163660 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.274659 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-combined-ca-bundle\") pod \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.275025 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-secret-key\") pod \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.275100 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-logs\") pod \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.275189 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-tls-certs\") pod \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.275331 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-config-data\") pod \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.275431 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-scripts\") pod \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.275538 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6gp\" (UniqueName: \"kubernetes.io/projected/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-kube-api-access-6g6gp\") pod \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\" (UID: \"b32b5d51-02d2-4eb7-9e52-09e2b5ec908f\") " Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.277062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-logs" (OuterVolumeSpecName: "logs") pod "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" (UID: "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.282318 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" (UID: "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.295308 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-kube-api-access-6g6gp" (OuterVolumeSpecName: "kube-api-access-6g6gp") pod "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" (UID: "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f"). InnerVolumeSpecName "kube-api-access-6g6gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.308391 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" (UID: "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.321169 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-config-data" (OuterVolumeSpecName: "config-data") pod "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" (UID: "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.322873 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-scripts" (OuterVolumeSpecName: "scripts") pod "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" (UID: "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.353003 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" (UID: "b32b5d51-02d2-4eb7-9e52-09e2b5ec908f"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.378290 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.378350 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.378370 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6gp\" (UniqueName: \"kubernetes.io/projected/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-kube-api-access-6g6gp\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.378392 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.378438 4841 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.378455 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-logs\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:49 crc kubenswrapper[4841]: I0130 06:50:49.378471 4841 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 06:50:50 crc kubenswrapper[4841]: I0130 06:50:50.105157 4841 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67cf966bf4-l6cdd" Jan 30 06:50:50 crc kubenswrapper[4841]: I0130 06:50:50.164765 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67cf966bf4-l6cdd"] Jan 30 06:50:50 crc kubenswrapper[4841]: I0130 06:50:50.179318 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67cf966bf4-l6cdd"] Jan 30 06:50:50 crc kubenswrapper[4841]: I0130 06:50:50.453125 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" path="/var/lib/kubelet/pods/b32b5d51-02d2-4eb7-9e52-09e2b5ec908f/volumes" Jan 30 06:50:53 crc kubenswrapper[4841]: I0130 06:50:53.098154 4841 scope.go:117] "RemoveContainer" containerID="4b19921ac258b7a81bd50539cda824dc6e1104ee0edd57de4e4db8a55fb77560" Jan 30 06:50:53 crc kubenswrapper[4841]: I0130 06:50:53.130751 4841 scope.go:117] "RemoveContainer" containerID="c0c975c2ae8957381471e3b4bd5ef1260153a2516ce0e7e793c0053ca9ef7c14" Jan 30 06:50:53 crc kubenswrapper[4841]: I0130 06:50:53.161296 4841 scope.go:117] "RemoveContainer" containerID="e4de32d38c29af149fbc2cb43bae0696cede6db965c16533eab42f831e6483a0" Jan 30 06:50:53 crc kubenswrapper[4841]: I0130 06:50:53.216668 4841 scope.go:117] "RemoveContainer" containerID="8a99113da5cb60b54732ce2862f6e446264f7631e60823cea8617e66044c1d76" Jan 30 06:50:53 crc kubenswrapper[4841]: I0130 06:50:53.265110 4841 scope.go:117] "RemoveContainer" containerID="cb797749fe33ff934f3ab20cebadd7d9c856c6147d540bbfb4367f4c5d56926b" Jan 30 06:50:53 crc kubenswrapper[4841]: I0130 06:50:53.294010 4841 scope.go:117] "RemoveContainer" containerID="8238e9d7576ea53b78b4060063ecd593cd36dfb4b2b76876395e3f04eff8d08b" Jan 30 06:50:53 crc kubenswrapper[4841]: I0130 06:50:53.347636 4841 scope.go:117] "RemoveContainer" containerID="f784072bbf33149506c374713bcad7f1e775ba849f776ee2bbf0fd14d862f299" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.573435 4841 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67fcb744b8-lwsgh"] Jan 30 06:50:59 crc kubenswrapper[4841]: E0130 06:50:59.574348 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574366 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: E0130 06:50:59.574382 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574391 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: E0130 06:50:59.574427 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574438 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: E0130 06:50:59.574452 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574462 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: E0130 06:50:59.574473 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574481 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" 
containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: E0130 06:50:59.574500 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574508 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574757 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574776 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="342b4f2d-a28e-4a01-9d96-bac5f85b92d1" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574792 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574808 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f3107e-b6dd-4265-8517-b38b5f4eae56" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574824 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.574843 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32b5d51-02d2-4eb7-9e52-09e2b5ec908f" containerName="horizon-log" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.576064 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.595386 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67fcb744b8-lwsgh"] Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.711893 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-horizon-secret-key\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.712053 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fcb13ee-9577-4300-909e-735238669ee3-scripts\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.712103 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fcb13ee-9577-4300-909e-735238669ee3-config-data\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.712182 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fcb13ee-9577-4300-909e-735238669ee3-logs\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.712309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-combined-ca-bundle\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.712356 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frs8m\" (UniqueName: \"kubernetes.io/projected/0fcb13ee-9577-4300-909e-735238669ee3-kube-api-access-frs8m\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.712576 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-horizon-tls-certs\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.814165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fcb13ee-9577-4300-909e-735238669ee3-scripts\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.814232 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fcb13ee-9577-4300-909e-735238669ee3-config-data\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.814292 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0fcb13ee-9577-4300-909e-735238669ee3-logs\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.814343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-combined-ca-bundle\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.814390 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frs8m\" (UniqueName: \"kubernetes.io/projected/0fcb13ee-9577-4300-909e-735238669ee3-kube-api-access-frs8m\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.814448 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-horizon-tls-certs\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.814522 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-horizon-secret-key\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.815172 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fcb13ee-9577-4300-909e-735238669ee3-logs\") pod 
\"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.815645 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fcb13ee-9577-4300-909e-735238669ee3-scripts\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.816475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fcb13ee-9577-4300-909e-735238669ee3-config-data\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.821415 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-horizon-secret-key\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.821911 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-combined-ca-bundle\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.822308 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fcb13ee-9577-4300-909e-735238669ee3-horizon-tls-certs\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc 
kubenswrapper[4841]: I0130 06:50:59.835453 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frs8m\" (UniqueName: \"kubernetes.io/projected/0fcb13ee-9577-4300-909e-735238669ee3-kube-api-access-frs8m\") pod \"horizon-67fcb744b8-lwsgh\" (UID: \"0fcb13ee-9577-4300-909e-735238669ee3\") " pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:50:59 crc kubenswrapper[4841]: I0130 06:50:59.895234 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.350513 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67fcb744b8-lwsgh"] Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.821809 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-sgbz8"] Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.824155 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.826493 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-sgbz8"] Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.870021 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c33414-5508-46f4-bce1-b008af425e4c-operator-scripts\") pod \"heat-db-create-sgbz8\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.870370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5jfv\" (UniqueName: \"kubernetes.io/projected/90c33414-5508-46f4-bce1-b008af425e4c-kube-api-access-l5jfv\") pod \"heat-db-create-sgbz8\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " pod="openstack/heat-db-create-sgbz8" Jan 
30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.938429 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-30c7-account-create-update-tzfxj"] Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.939652 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.942060 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.946900 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-30c7-account-create-update-tzfxj"] Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.972114 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5jfv\" (UniqueName: \"kubernetes.io/projected/90c33414-5508-46f4-bce1-b008af425e4c-kube-api-access-l5jfv\") pod \"heat-db-create-sgbz8\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.977073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c33414-5508-46f4-bce1-b008af425e4c-operator-scripts\") pod \"heat-db-create-sgbz8\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:00 crc kubenswrapper[4841]: I0130 06:51:00.977977 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c33414-5508-46f4-bce1-b008af425e4c-operator-scripts\") pod \"heat-db-create-sgbz8\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.016083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5jfv\" (UniqueName: 
\"kubernetes.io/projected/90c33414-5508-46f4-bce1-b008af425e4c-kube-api-access-l5jfv\") pod \"heat-db-create-sgbz8\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.077814 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhjq5\" (UniqueName: \"kubernetes.io/projected/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-kube-api-access-hhjq5\") pod \"heat-30c7-account-create-update-tzfxj\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.077919 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-operator-scripts\") pod \"heat-30c7-account-create-update-tzfxj\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.179019 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-operator-scripts\") pod \"heat-30c7-account-create-update-tzfxj\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.179145 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhjq5\" (UniqueName: \"kubernetes.io/projected/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-kube-api-access-hhjq5\") pod \"heat-30c7-account-create-update-tzfxj\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.180130 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-operator-scripts\") pod \"heat-30c7-account-create-update-tzfxj\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.186472 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.196040 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhjq5\" (UniqueName: \"kubernetes.io/projected/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-kube-api-access-hhjq5\") pod \"heat-30c7-account-create-update-tzfxj\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.255967 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.258604 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fcb744b8-lwsgh" event={"ID":"0fcb13ee-9577-4300-909e-735238669ee3","Type":"ContainerStarted","Data":"ef809b6933ddfcf98c2aee71f0b38343c0c9c64a6985a264e592457e3bf96c7b"} Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.258654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fcb744b8-lwsgh" event={"ID":"0fcb13ee-9577-4300-909e-735238669ee3","Type":"ContainerStarted","Data":"4b2d06d732610b226c5de76efa1ed5ce5d895a438edc99e0f7121051c9694b11"} Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.258665 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fcb744b8-lwsgh" event={"ID":"0fcb13ee-9577-4300-909e-735238669ee3","Type":"ContainerStarted","Data":"ebcb101f21b62647bed68b85e52b7f1f089876d75cc0b3b1488bf77df8981d38"} Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.289213 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67fcb744b8-lwsgh" podStartSLOduration=2.289194673 podStartE2EDuration="2.289194673s" podCreationTimestamp="2026-01-30 06:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:51:01.27592017 +0000 UTC m=+6198.269392808" watchObservedRunningTime="2026-01-30 06:51:01.289194673 +0000 UTC m=+6198.282667311" Jan 30 06:51:01 crc kubenswrapper[4841]: W0130 06:51:01.665391 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90c33414_5508_46f4_bce1_b008af425e4c.slice/crio-7b174845486d4afc452a46cafca1bc6d01b2b23420a68a3b477c079477d5402f WatchSource:0}: Error finding container 7b174845486d4afc452a46cafca1bc6d01b2b23420a68a3b477c079477d5402f: 
Status 404 returned error can't find the container with id 7b174845486d4afc452a46cafca1bc6d01b2b23420a68a3b477c079477d5402f Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.670942 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-sgbz8"] Jan 30 06:51:01 crc kubenswrapper[4841]: W0130 06:51:01.758030 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f84dc1_99f0_4b50_ae30_d15c6166c0b9.slice/crio-259ff06e02bd4162e6e470f874c78bbb9e9c2570fe20ec1d267c0689eb79bec1 WatchSource:0}: Error finding container 259ff06e02bd4162e6e470f874c78bbb9e9c2570fe20ec1d267c0689eb79bec1: Status 404 returned error can't find the container with id 259ff06e02bd4162e6e470f874c78bbb9e9c2570fe20ec1d267c0689eb79bec1 Jan 30 06:51:01 crc kubenswrapper[4841]: I0130 06:51:01.760849 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-30c7-account-create-update-tzfxj"] Jan 30 06:51:02 crc kubenswrapper[4841]: I0130 06:51:02.267435 4841 generic.go:334] "Generic (PLEG): container finished" podID="90c33414-5508-46f4-bce1-b008af425e4c" containerID="20043f1975fcdde8b6a39c9ce9b879d7e01f63fe9829b3f8a38492ae13b5b75b" exitCode=0 Jan 30 06:51:02 crc kubenswrapper[4841]: I0130 06:51:02.267490 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-sgbz8" event={"ID":"90c33414-5508-46f4-bce1-b008af425e4c","Type":"ContainerDied","Data":"20043f1975fcdde8b6a39c9ce9b879d7e01f63fe9829b3f8a38492ae13b5b75b"} Jan 30 06:51:02 crc kubenswrapper[4841]: I0130 06:51:02.267512 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-sgbz8" event={"ID":"90c33414-5508-46f4-bce1-b008af425e4c","Type":"ContainerStarted","Data":"7b174845486d4afc452a46cafca1bc6d01b2b23420a68a3b477c079477d5402f"} Jan 30 06:51:02 crc kubenswrapper[4841]: I0130 06:51:02.268879 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="96f84dc1-99f0-4b50-ae30-d15c6166c0b9" containerID="3c04dafe32da32c2968f6b97bb1aa5d805cece9d89402407e9a0a0b7fc40840c" exitCode=0 Jan 30 06:51:02 crc kubenswrapper[4841]: I0130 06:51:02.269751 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-30c7-account-create-update-tzfxj" event={"ID":"96f84dc1-99f0-4b50-ae30-d15c6166c0b9","Type":"ContainerDied","Data":"3c04dafe32da32c2968f6b97bb1aa5d805cece9d89402407e9a0a0b7fc40840c"} Jan 30 06:51:02 crc kubenswrapper[4841]: I0130 06:51:02.269775 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-30c7-account-create-update-tzfxj" event={"ID":"96f84dc1-99f0-4b50-ae30-d15c6166c0b9","Type":"ContainerStarted","Data":"259ff06e02bd4162e6e470f874c78bbb9e9c2570fe20ec1d267c0689eb79bec1"} Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.747124 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.753329 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.845532 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5jfv\" (UniqueName: \"kubernetes.io/projected/90c33414-5508-46f4-bce1-b008af425e4c-kube-api-access-l5jfv\") pod \"90c33414-5508-46f4-bce1-b008af425e4c\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.845873 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c33414-5508-46f4-bce1-b008af425e4c-operator-scripts\") pod \"90c33414-5508-46f4-bce1-b008af425e4c\" (UID: \"90c33414-5508-46f4-bce1-b008af425e4c\") " Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.846873 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c33414-5508-46f4-bce1-b008af425e4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90c33414-5508-46f4-bce1-b008af425e4c" (UID: "90c33414-5508-46f4-bce1-b008af425e4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.861185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c33414-5508-46f4-bce1-b008af425e4c-kube-api-access-l5jfv" (OuterVolumeSpecName: "kube-api-access-l5jfv") pod "90c33414-5508-46f4-bce1-b008af425e4c" (UID: "90c33414-5508-46f4-bce1-b008af425e4c"). InnerVolumeSpecName "kube-api-access-l5jfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.948147 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-operator-scripts\") pod \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.948907 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhjq5\" (UniqueName: \"kubernetes.io/projected/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-kube-api-access-hhjq5\") pod \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\" (UID: \"96f84dc1-99f0-4b50-ae30-d15c6166c0b9\") " Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.949462 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90c33414-5508-46f4-bce1-b008af425e4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.949478 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5jfv\" (UniqueName: \"kubernetes.io/projected/90c33414-5508-46f4-bce1-b008af425e4c-kube-api-access-l5jfv\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.948811 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96f84dc1-99f0-4b50-ae30-d15c6166c0b9" (UID: "96f84dc1-99f0-4b50-ae30-d15c6166c0b9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:51:03 crc kubenswrapper[4841]: I0130 06:51:03.953969 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-kube-api-access-hhjq5" (OuterVolumeSpecName: "kube-api-access-hhjq5") pod "96f84dc1-99f0-4b50-ae30-d15c6166c0b9" (UID: "96f84dc1-99f0-4b50-ae30-d15c6166c0b9"). InnerVolumeSpecName "kube-api-access-hhjq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.051570 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.051608 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhjq5\" (UniqueName: \"kubernetes.io/projected/96f84dc1-99f0-4b50-ae30-d15c6166c0b9-kube-api-access-hhjq5\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.310222 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-30c7-account-create-update-tzfxj" event={"ID":"96f84dc1-99f0-4b50-ae30-d15c6166c0b9","Type":"ContainerDied","Data":"259ff06e02bd4162e6e470f874c78bbb9e9c2570fe20ec1d267c0689eb79bec1"} Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.310273 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-30c7-account-create-update-tzfxj" Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.310287 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="259ff06e02bd4162e6e470f874c78bbb9e9c2570fe20ec1d267c0689eb79bec1" Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.313142 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-sgbz8" event={"ID":"90c33414-5508-46f4-bce1-b008af425e4c","Type":"ContainerDied","Data":"7b174845486d4afc452a46cafca1bc6d01b2b23420a68a3b477c079477d5402f"} Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.313183 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b174845486d4afc452a46cafca1bc6d01b2b23420a68a3b477c079477d5402f" Jan 30 06:51:04 crc kubenswrapper[4841]: I0130 06:51:04.313258 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sgbz8" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.041457 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-8sv85"] Jan 30 06:51:06 crc kubenswrapper[4841]: E0130 06:51:06.042052 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f84dc1-99f0-4b50-ae30-d15c6166c0b9" containerName="mariadb-account-create-update" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.042067 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f84dc1-99f0-4b50-ae30-d15c6166c0b9" containerName="mariadb-account-create-update" Jan 30 06:51:06 crc kubenswrapper[4841]: E0130 06:51:06.042090 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c33414-5508-46f4-bce1-b008af425e4c" containerName="mariadb-database-create" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.042097 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c33414-5508-46f4-bce1-b008af425e4c" containerName="mariadb-database-create" Jan 30 
06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.042289 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f84dc1-99f0-4b50-ae30-d15c6166c0b9" containerName="mariadb-account-create-update" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.042303 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c33414-5508-46f4-bce1-b008af425e4c" containerName="mariadb-database-create" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.042962 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.045603 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-q5r6c" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.046037 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.107578 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-8sv85"] Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.196952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr8xp\" (UniqueName: \"kubernetes.io/projected/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-kube-api-access-vr8xp\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.196998 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-combined-ca-bundle\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.197028 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-config-data\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.299295 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr8xp\" (UniqueName: \"kubernetes.io/projected/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-kube-api-access-vr8xp\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.299362 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-combined-ca-bundle\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.299450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-config-data\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.308020 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-config-data\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.315496 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-combined-ca-bundle\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.332931 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr8xp\" (UniqueName: \"kubernetes.io/projected/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-kube-api-access-vr8xp\") pod \"heat-db-sync-8sv85\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.373990 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:06 crc kubenswrapper[4841]: I0130 06:51:06.922734 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-8sv85"] Jan 30 06:51:07 crc kubenswrapper[4841]: I0130 06:51:07.347287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8sv85" event={"ID":"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3","Type":"ContainerStarted","Data":"97f508d0574dcc1ad98584e536bc68ad1524d7b12ec07c64af1de458db126282"} Jan 30 06:51:09 crc kubenswrapper[4841]: I0130 06:51:09.895938 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:51:09 crc kubenswrapper[4841]: I0130 06:51:09.896390 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:51:10 crc kubenswrapper[4841]: I0130 06:51:10.468523 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:51:10 crc kubenswrapper[4841]: I0130 06:51:10.468786 4841 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:51:10 crc kubenswrapper[4841]: I0130 06:51:10.468827 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:51:10 crc kubenswrapper[4841]: I0130 06:51:10.469579 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f5c280cd214672383e7f51c0a2877cc3f0cec94eda93bd7736e321f46e8506f"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:51:10 crc kubenswrapper[4841]: I0130 06:51:10.469629 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://5f5c280cd214672383e7f51c0a2877cc3f0cec94eda93bd7736e321f46e8506f" gracePeriod=600 Jan 30 06:51:11 crc kubenswrapper[4841]: I0130 06:51:11.395213 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="5f5c280cd214672383e7f51c0a2877cc3f0cec94eda93bd7736e321f46e8506f" exitCode=0 Jan 30 06:51:11 crc kubenswrapper[4841]: I0130 06:51:11.395264 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"5f5c280cd214672383e7f51c0a2877cc3f0cec94eda93bd7736e321f46e8506f"} Jan 30 06:51:11 crc kubenswrapper[4841]: I0130 06:51:11.395299 4841 
scope.go:117] "RemoveContainer" containerID="05013f4521a12c4cce26c98bf819754197f4fa3fa484acad07245d967437b156" Jan 30 06:51:15 crc kubenswrapper[4841]: I0130 06:51:15.435807 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2"} Jan 30 06:51:15 crc kubenswrapper[4841]: I0130 06:51:15.437480 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8sv85" event={"ID":"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3","Type":"ContainerStarted","Data":"a20b84a78e155854c194b854cb0f8b3a36b090e47ec4ea63e5b03a2a9c57d365"} Jan 30 06:51:15 crc kubenswrapper[4841]: I0130 06:51:15.492191 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-8sv85" podStartSLOduration=1.545817657 podStartE2EDuration="9.492167491s" podCreationTimestamp="2026-01-30 06:51:06 +0000 UTC" firstStartedPulling="2026-01-30 06:51:06.916220824 +0000 UTC m=+6203.909693502" lastFinishedPulling="2026-01-30 06:51:14.862570698 +0000 UTC m=+6211.856043336" observedRunningTime="2026-01-30 06:51:15.486364876 +0000 UTC m=+6212.479837524" watchObservedRunningTime="2026-01-30 06:51:15.492167491 +0000 UTC m=+6212.485640139" Jan 30 06:51:17 crc kubenswrapper[4841]: I0130 06:51:17.491611 4841 generic.go:334] "Generic (PLEG): container finished" podID="a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" containerID="a20b84a78e155854c194b854cb0f8b3a36b090e47ec4ea63e5b03a2a9c57d365" exitCode=0 Jan 30 06:51:17 crc kubenswrapper[4841]: I0130 06:51:17.491710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8sv85" event={"ID":"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3","Type":"ContainerDied","Data":"a20b84a78e155854c194b854cb0f8b3a36b090e47ec4ea63e5b03a2a9c57d365"} Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.262585 4841 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ngxh"] Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.266236 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.276158 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ngxh"] Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.397102 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btnk7\" (UniqueName: \"kubernetes.io/projected/00800c5e-91fd-4437-b1cf-6c5dece7d02c-kube-api-access-btnk7\") pod \"community-operators-8ngxh\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.397538 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-catalog-content\") pod \"community-operators-8ngxh\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.397766 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-utilities\") pod \"community-operators-8ngxh\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.499697 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-utilities\") pod \"community-operators-8ngxh\" 
(UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.501287 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-utilities\") pod \"community-operators-8ngxh\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.502976 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btnk7\" (UniqueName: \"kubernetes.io/projected/00800c5e-91fd-4437-b1cf-6c5dece7d02c-kube-api-access-btnk7\") pod \"community-operators-8ngxh\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.503922 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-catalog-content\") pod \"community-operators-8ngxh\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.504321 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-catalog-content\") pod \"community-operators-8ngxh\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.520900 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btnk7\" (UniqueName: \"kubernetes.io/projected/00800c5e-91fd-4437-b1cf-6c5dece7d02c-kube-api-access-btnk7\") pod \"community-operators-8ngxh\" (UID: 
\"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") " pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.621120 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:18 crc kubenswrapper[4841]: I0130 06:51:18.930065 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.023345 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr8xp\" (UniqueName: \"kubernetes.io/projected/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-kube-api-access-vr8xp\") pod \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.023580 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-config-data\") pod \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.023723 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-combined-ca-bundle\") pod \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\" (UID: \"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3\") " Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.028633 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-kube-api-access-vr8xp" (OuterVolumeSpecName: "kube-api-access-vr8xp") pod "a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" (UID: "a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3"). InnerVolumeSpecName "kube-api-access-vr8xp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.056717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" (UID: "a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.096288 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-config-data" (OuterVolumeSpecName: "config-data") pod "a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" (UID: "a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.126728 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr8xp\" (UniqueName: \"kubernetes.io/projected/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-kube-api-access-vr8xp\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.126769 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.126783 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:19 crc kubenswrapper[4841]: W0130 06:51:19.239846 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00800c5e_91fd_4437_b1cf_6c5dece7d02c.slice/crio-8dfb67f0800f4b4e56d7f3f67cf7255dd7f89ee5665392495d29cb532b97d8fe WatchSource:0}: Error finding container 8dfb67f0800f4b4e56d7f3f67cf7255dd7f89ee5665392495d29cb532b97d8fe: Status 404 returned error can't find the container with id 8dfb67f0800f4b4e56d7f3f67cf7255dd7f89ee5665392495d29cb532b97d8fe Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.243803 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ngxh"] Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.520301 4841 generic.go:334] "Generic (PLEG): container finished" podID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerID="ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885" exitCode=0 Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.520393 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngxh" event={"ID":"00800c5e-91fd-4437-b1cf-6c5dece7d02c","Type":"ContainerDied","Data":"ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885"} Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.520838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngxh" event={"ID":"00800c5e-91fd-4437-b1cf-6c5dece7d02c","Type":"ContainerStarted","Data":"8dfb67f0800f4b4e56d7f3f67cf7255dd7f89ee5665392495d29cb532b97d8fe"} Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.523089 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-8sv85" event={"ID":"a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3","Type":"ContainerDied","Data":"97f508d0574dcc1ad98584e536bc68ad1524d7b12ec07c64af1de458db126282"} Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.523118 4841 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="97f508d0574dcc1ad98584e536bc68ad1524d7b12ec07c64af1de458db126282" Jan 30 06:51:19 crc kubenswrapper[4841]: I0130 06:51:19.523237 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-8sv85" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.566388 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngxh" event={"ID":"00800c5e-91fd-4437-b1cf-6c5dece7d02c","Type":"ContainerStarted","Data":"8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc"} Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.816961 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6f87f7d7cf-94vvt"] Jan 30 06:51:20 crc kubenswrapper[4841]: E0130 06:51:20.820059 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" containerName="heat-db-sync" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.820081 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" containerName="heat-db-sync" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.820298 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" containerName="heat-db-sync" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.820909 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.839236 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-q5r6c" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.839318 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.841320 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.867891 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6f87f7d7cf-94vvt"] Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.923866 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6fd5789c7-qwll9"] Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.924980 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.926719 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.944973 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6fd5789c7-qwll9"] Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.973882 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6fdd9f8874-d7x78"] Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.975016 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.980096 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.988911 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fdd9f8874-d7x78"] Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.997109 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-668rv\" (UniqueName: \"kubernetes.io/projected/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-kube-api-access-668rv\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.997441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-combined-ca-bundle\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.997481 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:20 crc kubenswrapper[4841]: I0130 06:51:20.997513 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data-custom\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: 
\"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.099652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data-custom\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.099907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx5mk\" (UniqueName: \"kubernetes.io/projected/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-kube-api-access-zx5mk\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.100078 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-combined-ca-bundle\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.100201 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data-custom\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.100558 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-668rv\" (UniqueName: \"kubernetes.io/projected/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-kube-api-access-668rv\") 
pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.100690 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdwlj\" (UniqueName: \"kubernetes.io/projected/098c43da-f08f-4050-bed0-08952682d551-kube-api-access-fdwlj\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.100807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-combined-ca-bundle\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.100937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-combined-ca-bundle\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.101054 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.101171 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.101296 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data-custom\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.101508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.106841 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data-custom\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.107436 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.117789 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-668rv\" (UniqueName: 
\"kubernetes.io/projected/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-kube-api-access-668rv\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.128475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-combined-ca-bundle\") pod \"heat-engine-6f87f7d7cf-94vvt\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") " pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.159259 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.203870 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx5mk\" (UniqueName: \"kubernetes.io/projected/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-kube-api-access-zx5mk\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.203962 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-combined-ca-bundle\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.203992 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data-custom\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: 
I0130 06:51:21.204122 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdwlj\" (UniqueName: \"kubernetes.io/projected/098c43da-f08f-4050-bed0-08952682d551-kube-api-access-fdwlj\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.204157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-combined-ca-bundle\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.204186 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.204246 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.204292 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data-custom\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.214590 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.215189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-combined-ca-bundle\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.223337 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.225974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data-custom\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.230764 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-combined-ca-bundle\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.233425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx5mk\" (UniqueName: 
\"kubernetes.io/projected/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-kube-api-access-zx5mk\") pod \"heat-api-6fd5789c7-qwll9\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.233959 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdwlj\" (UniqueName: \"kubernetes.io/projected/098c43da-f08f-4050-bed0-08952682d551-kube-api-access-fdwlj\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.246225 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.248782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data-custom\") pod \"heat-cfnapi-6fdd9f8874-d7x78\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") " pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.295592 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:21 crc kubenswrapper[4841]: W0130 06:51:21.684894 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc77bd65_bd9c_4a2e_842d_3606ce65a12b.slice/crio-4d0d989186d7b25b1fa1c3c7d68cecdb36d3639e714c76cac31ec46bc40c1247 WatchSource:0}: Error finding container 4d0d989186d7b25b1fa1c3c7d68cecdb36d3639e714c76cac31ec46bc40c1247: Status 404 returned error can't find the container with id 4d0d989186d7b25b1fa1c3c7d68cecdb36d3639e714c76cac31ec46bc40c1247 Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.690078 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6f87f7d7cf-94vvt"] Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.803539 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6fd5789c7-qwll9"] Jan 30 06:51:21 crc kubenswrapper[4841]: I0130 06:51:21.897379 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fdd9f8874-d7x78"] Jan 30 06:51:22 crc kubenswrapper[4841]: I0130 06:51:22.029369 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:51:22 crc kubenswrapper[4841]: I0130 06:51:22.587950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6fd5789c7-qwll9" event={"ID":"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446","Type":"ContainerStarted","Data":"686cede759085a3573f951d6ee7b65d242bd00f3be933784eea1772bedab1e71"} Jan 30 06:51:22 crc kubenswrapper[4841]: I0130 06:51:22.589659 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f87f7d7cf-94vvt" event={"ID":"cc77bd65-bd9c-4a2e-842d-3606ce65a12b","Type":"ContainerStarted","Data":"dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4"} Jan 30 06:51:22 crc kubenswrapper[4841]: I0130 06:51:22.589681 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/heat-engine-6f87f7d7cf-94vvt" event={"ID":"cc77bd65-bd9c-4a2e-842d-3606ce65a12b","Type":"ContainerStarted","Data":"4d0d989186d7b25b1fa1c3c7d68cecdb36d3639e714c76cac31ec46bc40c1247"} Jan 30 06:51:22 crc kubenswrapper[4841]: I0130 06:51:22.589843 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:22 crc kubenswrapper[4841]: I0130 06:51:22.590833 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" event={"ID":"098c43da-f08f-4050-bed0-08952682d551","Type":"ContainerStarted","Data":"965efbcc6d37ea2dbc50149e8298fbdaa162298a95a1b1f9d9901c96c9d49f97"} Jan 30 06:51:22 crc kubenswrapper[4841]: I0130 06:51:22.608509 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6f87f7d7cf-94vvt" podStartSLOduration=2.608492766 podStartE2EDuration="2.608492766s" podCreationTimestamp="2026-01-30 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:51:22.601659294 +0000 UTC m=+6219.595131932" watchObservedRunningTime="2026-01-30 06:51:22.608492766 +0000 UTC m=+6219.601965404" Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.046728 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tcrrl"] Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.054709 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d67b-account-create-update-vkk58"] Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.061736 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tcrrl"] Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.070812 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d67b-account-create-update-vkk58"] Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 
06:51:23.606321 4841 generic.go:334] "Generic (PLEG): container finished" podID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerID="8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc" exitCode=0 Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.606379 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngxh" event={"ID":"00800c5e-91fd-4437-b1cf-6c5dece7d02c","Type":"ContainerDied","Data":"8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc"} Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.667540 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67fcb744b8-lwsgh" Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.745904 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c6697b658-bhtrl"] Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.746129 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c6697b658-bhtrl" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon-log" containerID="cri-o://26bf91e83f7f783284c5ae3f95e4da9624775557e77d94dea27183e0d82dd032" gracePeriod=30 Jan 30 06:51:23 crc kubenswrapper[4841]: I0130 06:51:23.746597 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c6697b658-bhtrl" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" containerID="cri-o://d49ea3d715b7b663c44877634bae9eb11ba651f1df614d0a17cf00d282c2a96c" gracePeriod=30 Jan 30 06:51:24 crc kubenswrapper[4841]: I0130 06:51:24.446115 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43" path="/var/lib/kubelet/pods/e3a4d80f-1e00-4dd4-b5ee-1e300fb25c43/volumes" Jan 30 06:51:24 crc kubenswrapper[4841]: I0130 06:51:24.448029 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e57083-b3fd-4ac6-b8bf-3761a913defc" 
path="/var/lib/kubelet/pods/f9e57083-b3fd-4ac6-b8bf-3761a913defc/volumes" Jan 30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.637556 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6fd5789c7-qwll9" event={"ID":"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446","Type":"ContainerStarted","Data":"e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84"} Jan 30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.638017 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.640465 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngxh" event={"ID":"00800c5e-91fd-4437-b1cf-6c5dece7d02c","Type":"ContainerStarted","Data":"0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2"} Jan 30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.642574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" event={"ID":"098c43da-f08f-4050-bed0-08952682d551","Type":"ContainerStarted","Data":"aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea"} Jan 30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.642703 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.660334 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6fd5789c7-qwll9" podStartSLOduration=2.81106349 podStartE2EDuration="5.660316408s" podCreationTimestamp="2026-01-30 06:51:20 +0000 UTC" firstStartedPulling="2026-01-30 06:51:21.809374081 +0000 UTC m=+6218.802846719" lastFinishedPulling="2026-01-30 06:51:24.658626999 +0000 UTC m=+6221.652099637" observedRunningTime="2026-01-30 06:51:25.651614685 +0000 UTC m=+6222.645087343" watchObservedRunningTime="2026-01-30 06:51:25.660316408 +0000 UTC m=+6222.653789046" Jan 
30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.677676 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ngxh" podStartSLOduration=2.553854612 podStartE2EDuration="7.67766183s" podCreationTimestamp="2026-01-30 06:51:18 +0000 UTC" firstStartedPulling="2026-01-30 06:51:19.525916084 +0000 UTC m=+6216.519388742" lastFinishedPulling="2026-01-30 06:51:24.649723322 +0000 UTC m=+6221.643195960" observedRunningTime="2026-01-30 06:51:25.672283287 +0000 UTC m=+6222.665755925" watchObservedRunningTime="2026-01-30 06:51:25.67766183 +0000 UTC m=+6222.671134468" Jan 30 06:51:25 crc kubenswrapper[4841]: I0130 06:51:25.685960 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" podStartSLOduration=2.927609588 podStartE2EDuration="5.685943481s" podCreationTimestamp="2026-01-30 06:51:20 +0000 UTC" firstStartedPulling="2026-01-30 06:51:21.890888255 +0000 UTC m=+6218.884360893" lastFinishedPulling="2026-01-30 06:51:24.649222148 +0000 UTC m=+6221.642694786" observedRunningTime="2026-01-30 06:51:25.683855515 +0000 UTC m=+6222.677328153" watchObservedRunningTime="2026-01-30 06:51:25.685943481 +0000 UTC m=+6222.679416129" Jan 30 06:51:26 crc kubenswrapper[4841]: I0130 06:51:26.921884 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c6697b658-bhtrl" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:47768->10.217.1.121:8443: read: connection reset by peer" Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.667620 4841 generic.go:334] "Generic (PLEG): container finished" podID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerID="d49ea3d715b7b663c44877634bae9eb11ba651f1df614d0a17cf00d282c2a96c" exitCode=0 Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.667689 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6697b658-bhtrl" event={"ID":"cfa7cd93-66af-4dae-a610-d61b6235af81","Type":"ContainerDied","Data":"d49ea3d715b7b663c44877634bae9eb11ba651f1df614d0a17cf00d282c2a96c"}
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.813654 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-69b8748bf7-r7xx5"]
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.815064 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.832113 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-69b8748bf7-r7xx5"]
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.874691 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-65545bbfd6-drrjx"]
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.876049 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.899050 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-64c974b8d6-smh57"]
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.900351 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.911668 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64c974b8d6-smh57"]
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.922165 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-65545bbfd6-drrjx"]
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.946180 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkr7p\" (UniqueName: \"kubernetes.io/projected/484c8903-064d-425e-b01b-2d61dbe306da-kube-api-access-pkr7p\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.946270 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-combined-ca-bundle\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.946314 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-config-data\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:27 crc kubenswrapper[4841]: I0130 06:51:27.946373 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-config-data-custom\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.049374 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.049443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-config-data-custom\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.049465 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data-custom\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.049510 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-combined-ca-bundle\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.049532 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.049946 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkr7p\" (UniqueName: \"kubernetes.io/projected/484c8903-064d-425e-b01b-2d61dbe306da-kube-api-access-pkr7p\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.050017 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzr2\" (UniqueName: \"kubernetes.io/projected/76af154e-00d9-4cf0-80dd-2b39090fb44e-kube-api-access-5vzr2\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.050125 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-combined-ca-bundle\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.050344 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-combined-ca-bundle\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.050533 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data-custom\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.050753 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-config-data\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.050788 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8lrb\" (UniqueName: \"kubernetes.io/projected/095b719b-f5af-480d-a260-f44e526d65ee-kube-api-access-v8lrb\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.059284 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-combined-ca-bundle\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.061337 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-config-data\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.064930 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/484c8903-064d-425e-b01b-2d61dbe306da-config-data-custom\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.071733 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkr7p\" (UniqueName: \"kubernetes.io/projected/484c8903-064d-425e-b01b-2d61dbe306da-kube-api-access-pkr7p\") pod \"heat-engine-69b8748bf7-r7xx5\" (UID: \"484c8903-064d-425e-b01b-2d61dbe306da\") " pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.137115 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-69b8748bf7-r7xx5"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.153601 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzr2\" (UniqueName: \"kubernetes.io/projected/76af154e-00d9-4cf0-80dd-2b39090fb44e-kube-api-access-5vzr2\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.153716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-combined-ca-bundle\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.153794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data-custom\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.153881 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8lrb\" (UniqueName: \"kubernetes.io/projected/095b719b-f5af-480d-a260-f44e526d65ee-kube-api-access-v8lrb\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.154001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.154039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data-custom\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.154118 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-combined-ca-bundle\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.154172 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.158050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data-custom\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.160423 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-combined-ca-bundle\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.160795 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data-custom\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.163676 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.170321 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.171548 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-combined-ca-bundle\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.179894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8lrb\" (UniqueName: \"kubernetes.io/projected/095b719b-f5af-480d-a260-f44e526d65ee-kube-api-access-v8lrb\") pod \"heat-api-65545bbfd6-drrjx\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") " pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.181722 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzr2\" (UniqueName: \"kubernetes.io/projected/76af154e-00d9-4cf0-80dd-2b39090fb44e-kube-api-access-5vzr2\") pod \"heat-cfnapi-64c974b8d6-smh57\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") " pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.195928 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.221626 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.622238 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ngxh"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.622501 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ngxh"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.659867 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-69b8748bf7-r7xx5"]
Jan 30 06:51:28 crc kubenswrapper[4841]: W0130 06:51:28.739619 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod095b719b_f5af_480d_a260_f44e526d65ee.slice/crio-abd913d2b5d31669758505abb21840f3e6e6cadcc12c875d0ae9ebf8b05a761c WatchSource:0}: Error finding container abd913d2b5d31669758505abb21840f3e6e6cadcc12c875d0ae9ebf8b05a761c: Status 404 returned error can't find the container with id abd913d2b5d31669758505abb21840f3e6e6cadcc12c875d0ae9ebf8b05a761c
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.740910 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-65545bbfd6-drrjx"]
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.767860 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64c974b8d6-smh57"]
Jan 30 06:51:28 crc kubenswrapper[4841]: W0130 06:51:28.769665 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76af154e_00d9_4cf0_80dd_2b39090fb44e.slice/crio-75baba782775a936718d246f25f8145f681ac55bbc9fa4cd455a56988ef7cb59 WatchSource:0}: Error finding container 75baba782775a936718d246f25f8145f681ac55bbc9fa4cd455a56988ef7cb59: Status 404 returned error can't find the container with id 75baba782775a936718d246f25f8145f681ac55bbc9fa4cd455a56988ef7cb59
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.919198 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6fd5789c7-qwll9"]
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.919754 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fdd9f8874-d7x78"]
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.927367 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6fd5789c7-qwll9" podUID="dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" containerName="heat-api" containerID="cri-o://e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84" gracePeriod=60
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.928174 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" podUID="098c43da-f08f-4050-bed0-08952682d551" containerName="heat-cfnapi" containerID="cri-o://aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea" gracePeriod=60
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.943352 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-766f7d9fb9-kf8rg"]
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.944532 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.963305 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc"
Jan 30 06:51:28 crc kubenswrapper[4841]: I0130 06:51:28.963801 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.009472 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-67dcbb9cbb-qvgxj"]
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.027421 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67dcbb9cbb-qvgxj"]
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.027530 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.029989 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.032614 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.070865 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-766f7d9fb9-kf8rg"]
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.077362 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-internal-tls-certs\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.077416 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-config-data-custom\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.077443 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-combined-ca-bundle\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.077482 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-public-tls-certs\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.077547 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5ck\" (UniqueName: \"kubernetes.io/projected/23b34f45-6116-42be-b368-f81b715edee4-kube-api-access-jl5ck\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.077584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-config-data\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.179440 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-internal-tls-certs\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.179714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-config-data-custom\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.179770 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-combined-ca-bundle\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.179797 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltj5v\" (UniqueName: \"kubernetes.io/projected/baef3ad1-d37a-4782-af64-1e87771092cd-kube-api-access-ltj5v\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.179852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-config-data-custom\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.179880 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-public-tls-certs\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.179960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-combined-ca-bundle\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.180042 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5ck\" (UniqueName: \"kubernetes.io/projected/23b34f45-6116-42be-b368-f81b715edee4-kube-api-access-jl5ck\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.180454 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-internal-tls-certs\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.180473 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-public-tls-certs\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.180528 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-config-data\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.180605 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-config-data\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.184361 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-combined-ca-bundle\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.184552 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-config-data-custom\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.184992 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-config-data\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.185350 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-public-tls-certs\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.190136 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b34f45-6116-42be-b368-f81b715edee4-internal-tls-certs\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.195292 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5ck\" (UniqueName: \"kubernetes.io/projected/23b34f45-6116-42be-b368-f81b715edee4-kube-api-access-jl5ck\") pod \"heat-api-766f7d9fb9-kf8rg\" (UID: \"23b34f45-6116-42be-b368-f81b715edee4\") " pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.276550 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.283505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltj5v\" (UniqueName: \"kubernetes.io/projected/baef3ad1-d37a-4782-af64-1e87771092cd-kube-api-access-ltj5v\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.283554 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-config-data-custom\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.283610 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-combined-ca-bundle\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.283642 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-internal-tls-certs\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.283668 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-public-tls-certs\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.283862 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-config-data\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.288136 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-config-data\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.289541 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-public-tls-certs\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.290101 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-combined-ca-bundle\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.293868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-internal-tls-certs\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.301003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baef3ad1-d37a-4782-af64-1e87771092cd-config-data-custom\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.305050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltj5v\" (UniqueName: \"kubernetes.io/projected/baef3ad1-d37a-4782-af64-1e87771092cd-kube-api-access-ltj5v\") pod \"heat-cfnapi-67dcbb9cbb-qvgxj\" (UID: \"baef3ad1-d37a-4782-af64-1e87771092cd\") " pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.437517 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.593596 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.682309 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8ngxh" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="registry-server" probeResult="failure" output=<
Jan 30 06:51:29 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Jan 30 06:51:29 crc kubenswrapper[4841]: >
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.683347 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6fd5789c7-qwll9"
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.687008 4841 generic.go:334] "Generic (PLEG): container finished" podID="095b719b-f5af-480d-a260-f44e526d65ee" containerID="485f48808c8b57a109f900e015d22e66965bf6719d5f6fa0187bd1285593e0de" exitCode=1
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.687082 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-65545bbfd6-drrjx" event={"ID":"095b719b-f5af-480d-a260-f44e526d65ee","Type":"ContainerDied","Data":"485f48808c8b57a109f900e015d22e66965bf6719d5f6fa0187bd1285593e0de"}
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.687111 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-65545bbfd6-drrjx" event={"ID":"095b719b-f5af-480d-a260-f44e526d65ee","Type":"ContainerStarted","Data":"abd913d2b5d31669758505abb21840f3e6e6cadcc12c875d0ae9ebf8b05a761c"}
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.690502 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-combined-ca-bundle\") pod \"098c43da-f08f-4050-bed0-08952682d551\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") "
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.690536 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data\") pod \"098c43da-f08f-4050-bed0-08952682d551\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") "
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.690746 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data-custom\") pod \"098c43da-f08f-4050-bed0-08952682d551\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") "
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.690847 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdwlj\" (UniqueName: \"kubernetes.io/projected/098c43da-f08f-4050-bed0-08952682d551-kube-api-access-fdwlj\") pod \"098c43da-f08f-4050-bed0-08952682d551\" (UID: \"098c43da-f08f-4050-bed0-08952682d551\") "
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.697863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "098c43da-f08f-4050-bed0-08952682d551" (UID: "098c43da-f08f-4050-bed0-08952682d551"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.700363 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/098c43da-f08f-4050-bed0-08952682d551-kube-api-access-fdwlj" (OuterVolumeSpecName: "kube-api-access-fdwlj") pod "098c43da-f08f-4050-bed0-08952682d551" (UID: "098c43da-f08f-4050-bed0-08952682d551"). InnerVolumeSpecName "kube-api-access-fdwlj".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.701943 4841 scope.go:117] "RemoveContainer" containerID="485f48808c8b57a109f900e015d22e66965bf6719d5f6fa0187bd1285593e0de" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.704164 4841 generic.go:334] "Generic (PLEG): container finished" podID="dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" containerID="e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84" exitCode=0 Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.704238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6fd5789c7-qwll9" event={"ID":"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446","Type":"ContainerDied","Data":"e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.704263 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6fd5789c7-qwll9" event={"ID":"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446","Type":"ContainerDied","Data":"686cede759085a3573f951d6ee7b65d242bd00f3be933784eea1772bedab1e71"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.704279 4841 scope.go:117] "RemoveContainer" containerID="e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.705897 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6fd5789c7-qwll9" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.728381 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69b8748bf7-r7xx5" event={"ID":"484c8903-064d-425e-b01b-2d61dbe306da","Type":"ContainerStarted","Data":"4aaa58be4ff12091b7fbb3a60e1caa2bac920278bfaa407117c557df7632a196"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.728428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69b8748bf7-r7xx5" event={"ID":"484c8903-064d-425e-b01b-2d61dbe306da","Type":"ContainerStarted","Data":"dd66aba8d05e763e12f997045385cc9ab57a4e06ce779bbd51579872e30ad976"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.729155 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-69b8748bf7-r7xx5" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.730491 4841 generic.go:334] "Generic (PLEG): container finished" podID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerID="5efc0d5d48aecf712320e720a0714f3e769e5ab04937d27742a4a30fe994f739" exitCode=1 Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.730524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64c974b8d6-smh57" event={"ID":"76af154e-00d9-4cf0-80dd-2b39090fb44e","Type":"ContainerDied","Data":"5efc0d5d48aecf712320e720a0714f3e769e5ab04937d27742a4a30fe994f739"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.730541 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64c974b8d6-smh57" event={"ID":"76af154e-00d9-4cf0-80dd-2b39090fb44e","Type":"ContainerStarted","Data":"75baba782775a936718d246f25f8145f681ac55bbc9fa4cd455a56988ef7cb59"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.730794 4841 scope.go:117] "RemoveContainer" containerID="5efc0d5d48aecf712320e720a0714f3e769e5ab04937d27742a4a30fe994f739" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.738393 
4841 generic.go:334] "Generic (PLEG): container finished" podID="098c43da-f08f-4050-bed0-08952682d551" containerID="aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea" exitCode=0 Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.738443 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" event={"ID":"098c43da-f08f-4050-bed0-08952682d551","Type":"ContainerDied","Data":"aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.738467 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" event={"ID":"098c43da-f08f-4050-bed0-08952682d551","Type":"ContainerDied","Data":"965efbcc6d37ea2dbc50149e8298fbdaa162298a95a1b1f9d9901c96c9d49f97"} Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.738510 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fdd9f8874-d7x78" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.757809 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "098c43da-f08f-4050-bed0-08952682d551" (UID: "098c43da-f08f-4050-bed0-08952682d551"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.792335 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-69b8748bf7-r7xx5" podStartSLOduration=2.792314951 podStartE2EDuration="2.792314951s" podCreationTimestamp="2026-01-30 06:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:51:29.779164679 +0000 UTC m=+6226.772637317" watchObservedRunningTime="2026-01-30 06:51:29.792314951 +0000 UTC m=+6226.785787589" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.795162 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-combined-ca-bundle\") pod \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.795301 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data-custom\") pod \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.795342 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data\") pod \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.795441 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx5mk\" (UniqueName: \"kubernetes.io/projected/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-kube-api-access-zx5mk\") pod 
\"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\" (UID: \"dafd8dae-b1e5-4c27-9644-0d1ebf5fb446\") " Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.797647 4841 scope.go:117] "RemoveContainer" containerID="e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84" Jan 30 06:51:29 crc kubenswrapper[4841]: E0130 06:51:29.799342 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84\": container with ID starting with e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84 not found: ID does not exist" containerID="e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.799380 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84"} err="failed to get container status \"e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84\": rpc error: code = NotFound desc = could not find container \"e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84\": container with ID starting with e98c1d001917d2c2fda58d99485d4d7e09916ae236f5cdb56040a77223b22c84 not found: ID does not exist" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.799417 4841 scope.go:117] "RemoveContainer" containerID="aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.799512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data" (OuterVolumeSpecName: "config-data") pod "098c43da-f08f-4050-bed0-08952682d551" (UID: "098c43da-f08f-4050-bed0-08952682d551"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.800772 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.800795 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.800804 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098c43da-f08f-4050-bed0-08952682d551-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.800813 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdwlj\" (UniqueName: \"kubernetes.io/projected/098c43da-f08f-4050-bed0-08952682d551-kube-api-access-fdwlj\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.804334 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" (UID: "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.806830 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-kube-api-access-zx5mk" (OuterVolumeSpecName: "kube-api-access-zx5mk") pod "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" (UID: "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446"). InnerVolumeSpecName "kube-api-access-zx5mk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.834600 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-766f7d9fb9-kf8rg"] Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.881229 4841 scope.go:117] "RemoveContainer" containerID="aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea" Jan 30 06:51:29 crc kubenswrapper[4841]: E0130 06:51:29.881732 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea\": container with ID starting with aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea not found: ID does not exist" containerID="aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.881786 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea"} err="failed to get container status \"aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea\": rpc error: code = NotFound desc = could not find container \"aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea\": container with ID starting with aea1804bac35e70d74e89e8b0a0f1ca3a216905799bb12e0edad613adbf677ea not found: ID does not exist" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.906289 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.906318 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx5mk\" (UniqueName: \"kubernetes.io/projected/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-kube-api-access-zx5mk\") on node \"crc\" 
DevicePath \"\"" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.960586 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" (UID: "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:51:29 crc kubenswrapper[4841]: I0130 06:51:29.962518 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data" (OuterVolumeSpecName: "config-data") pod "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" (UID: "dafd8dae-b1e5-4c27-9644-0d1ebf5fb446"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.008231 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.008264 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.060335 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6fd5789c7-qwll9"] Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.074453 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6fd5789c7-qwll9"] Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.086243 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67dcbb9cbb-qvgxj"] Jan 30 06:51:30 crc kubenswrapper[4841]: W0130 06:51:30.089735 4841 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaef3ad1_d37a_4782_af64_1e87771092cd.slice/crio-6c53766c672b8e077676e6f2ec5b441e013c9ee71a1403a69d9c197dd2ba138f WatchSource:0}: Error finding container 6c53766c672b8e077676e6f2ec5b441e013c9ee71a1403a69d9c197dd2ba138f: Status 404 returned error can't find the container with id 6c53766c672b8e077676e6f2ec5b441e013c9ee71a1403a69d9c197dd2ba138f Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.463782 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" path="/var/lib/kubelet/pods/dafd8dae-b1e5-4c27-9644-0d1ebf5fb446/volumes" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.464678 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fdd9f8874-d7x78"] Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.472279 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6fdd9f8874-d7x78"] Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.749817 4841 generic.go:334] "Generic (PLEG): container finished" podID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerID="9c4d57a9299f3a46afb5d17afcc3420d36852d863361443faefaf8ccbe724954" exitCode=1 Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.749878 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64c974b8d6-smh57" event={"ID":"76af154e-00d9-4cf0-80dd-2b39090fb44e","Type":"ContainerDied","Data":"9c4d57a9299f3a46afb5d17afcc3420d36852d863361443faefaf8ccbe724954"} Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.749913 4841 scope.go:117] "RemoveContainer" containerID="5efc0d5d48aecf712320e720a0714f3e769e5ab04937d27742a4a30fe994f739" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.751015 4841 scope.go:117] "RemoveContainer" containerID="9c4d57a9299f3a46afb5d17afcc3420d36852d863361443faefaf8ccbe724954" Jan 30 06:51:30 crc kubenswrapper[4841]: E0130 
06:51:30.751519 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-64c974b8d6-smh57_openstack(76af154e-00d9-4cf0-80dd-2b39090fb44e)\"" pod="openstack/heat-cfnapi-64c974b8d6-smh57" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.753548 4841 generic.go:334] "Generic (PLEG): container finished" podID="095b719b-f5af-480d-a260-f44e526d65ee" containerID="183fbe802cd4ddfa191bb52437f674dfa7bc66c9a7dc05127ce5d45a68f29335" exitCode=1 Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.753608 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-65545bbfd6-drrjx" event={"ID":"095b719b-f5af-480d-a260-f44e526d65ee","Type":"ContainerDied","Data":"183fbe802cd4ddfa191bb52437f674dfa7bc66c9a7dc05127ce5d45a68f29335"} Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.754478 4841 scope.go:117] "RemoveContainer" containerID="183fbe802cd4ddfa191bb52437f674dfa7bc66c9a7dc05127ce5d45a68f29335" Jan 30 06:51:30 crc kubenswrapper[4841]: E0130 06:51:30.754745 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-65545bbfd6-drrjx_openstack(095b719b-f5af-480d-a260-f44e526d65ee)\"" pod="openstack/heat-api-65545bbfd6-drrjx" podUID="095b719b-f5af-480d-a260-f44e526d65ee" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.755850 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-766f7d9fb9-kf8rg" event={"ID":"23b34f45-6116-42be-b368-f81b715edee4","Type":"ContainerStarted","Data":"f0392888539fb39a16dc206a4d1a9df6879e505d8ab484398ada697bc0797a36"} Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.755877 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-766f7d9fb9-kf8rg" event={"ID":"23b34f45-6116-42be-b368-f81b715edee4","Type":"ContainerStarted","Data":"891e6bed11b6d1e0e5fc38224145aeb6d5514902d5220808e1501b49ba22fad8"} Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.756364 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-766f7d9fb9-kf8rg" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.759352 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj" event={"ID":"baef3ad1-d37a-4782-af64-1e87771092cd","Type":"ContainerStarted","Data":"e86fd85502538ec7f1d17f30c2c2d672f428893e3ccdf12cd2ad814c8db585d5"} Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.759378 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj" event={"ID":"baef3ad1-d37a-4782-af64-1e87771092cd","Type":"ContainerStarted","Data":"6c53766c672b8e077676e6f2ec5b441e013c9ee71a1403a69d9c197dd2ba138f"} Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.759514 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.831972 4841 scope.go:117] "RemoveContainer" containerID="485f48808c8b57a109f900e015d22e66965bf6719d5f6fa0187bd1285593e0de" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.859303 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-766f7d9fb9-kf8rg" podStartSLOduration=2.8592831199999997 podStartE2EDuration="2.85928312s" podCreationTimestamp="2026-01-30 06:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:51:30.851108462 +0000 UTC m=+6227.844581100" watchObservedRunningTime="2026-01-30 06:51:30.85928312 +0000 UTC m=+6227.852755758" Jan 30 06:51:30 crc kubenswrapper[4841]: I0130 06:51:30.876138 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj" podStartSLOduration=2.876119519 podStartE2EDuration="2.876119519s" podCreationTimestamp="2026-01-30 06:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:51:30.865614569 +0000 UTC m=+6227.859087197" watchObservedRunningTime="2026-01-30 06:51:30.876119519 +0000 UTC m=+6227.869592157" Jan 30 06:51:31 crc kubenswrapper[4841]: I0130 06:51:31.029658 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nh96x"] Jan 30 06:51:31 crc kubenswrapper[4841]: I0130 06:51:31.038287 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nh96x"] Jan 30 06:51:31 crc kubenswrapper[4841]: I0130 06:51:31.188136 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6f87f7d7cf-94vvt" Jan 30 06:51:31 crc kubenswrapper[4841]: I0130 06:51:31.770820 4841 scope.go:117] "RemoveContainer" containerID="9c4d57a9299f3a46afb5d17afcc3420d36852d863361443faefaf8ccbe724954" Jan 30 06:51:31 crc kubenswrapper[4841]: E0130 06:51:31.771194 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-64c974b8d6-smh57_openstack(76af154e-00d9-4cf0-80dd-2b39090fb44e)\"" pod="openstack/heat-cfnapi-64c974b8d6-smh57" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" Jan 30 06:51:31 crc kubenswrapper[4841]: I0130 06:51:31.774237 4841 scope.go:117] "RemoveContainer" containerID="183fbe802cd4ddfa191bb52437f674dfa7bc66c9a7dc05127ce5d45a68f29335" Jan 30 06:51:31 crc kubenswrapper[4841]: E0130 06:51:31.774471 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting 
failed container=heat-api pod=heat-api-65545bbfd6-drrjx_openstack(095b719b-f5af-480d-a260-f44e526d65ee)\"" pod="openstack/heat-api-65545bbfd6-drrjx" podUID="095b719b-f5af-480d-a260-f44e526d65ee" Jan 30 06:51:32 crc kubenswrapper[4841]: I0130 06:51:32.443688 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c6cefc-90f2-482c-9436-12542e71f13c" path="/var/lib/kubelet/pods/06c6cefc-90f2-482c-9436-12542e71f13c/volumes" Jan 30 06:51:32 crc kubenswrapper[4841]: I0130 06:51:32.445639 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="098c43da-f08f-4050-bed0-08952682d551" path="/var/lib/kubelet/pods/098c43da-f08f-4050-bed0-08952682d551/volumes" Jan 30 06:51:33 crc kubenswrapper[4841]: I0130 06:51:33.196897 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-65545bbfd6-drrjx" Jan 30 06:51:33 crc kubenswrapper[4841]: I0130 06:51:33.197175 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-65545bbfd6-drrjx" Jan 30 06:51:33 crc kubenswrapper[4841]: I0130 06:51:33.198079 4841 scope.go:117] "RemoveContainer" containerID="183fbe802cd4ddfa191bb52437f674dfa7bc66c9a7dc05127ce5d45a68f29335" Jan 30 06:51:33 crc kubenswrapper[4841]: E0130 06:51:33.198328 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-65545bbfd6-drrjx_openstack(095b719b-f5af-480d-a260-f44e526d65ee)\"" pod="openstack/heat-api-65545bbfd6-drrjx" podUID="095b719b-f5af-480d-a260-f44e526d65ee" Jan 30 06:51:33 crc kubenswrapper[4841]: I0130 06:51:33.221894 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-64c974b8d6-smh57" Jan 30 06:51:33 crc kubenswrapper[4841]: I0130 06:51:33.221970 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-cfnapi-64c974b8d6-smh57" Jan 30 06:51:33 crc kubenswrapper[4841]: I0130 06:51:33.222703 4841 scope.go:117] "RemoveContainer" containerID="9c4d57a9299f3a46afb5d17afcc3420d36852d863361443faefaf8ccbe724954" Jan 30 06:51:33 crc kubenswrapper[4841]: E0130 06:51:33.222917 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-64c974b8d6-smh57_openstack(76af154e-00d9-4cf0-80dd-2b39090fb44e)\"" pod="openstack/heat-cfnapi-64c974b8d6-smh57" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" Jan 30 06:51:34 crc kubenswrapper[4841]: I0130 06:51:34.793898 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c6697b658-bhtrl" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.121:8443: connect: connection refused" Jan 30 06:51:38 crc kubenswrapper[4841]: I0130 06:51:38.189316 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-69b8748bf7-r7xx5" Jan 30 06:51:38 crc kubenswrapper[4841]: I0130 06:51:38.270295 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6f87f7d7cf-94vvt"] Jan 30 06:51:38 crc kubenswrapper[4841]: I0130 06:51:38.270588 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6f87f7d7cf-94vvt" podUID="cc77bd65-bd9c-4a2e-842d-3606ce65a12b" containerName="heat-engine" containerID="cri-o://dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4" gracePeriod=60 Jan 30 06:51:38 crc kubenswrapper[4841]: I0130 06:51:38.710634 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ngxh" Jan 30 06:51:38 crc kubenswrapper[4841]: I0130 06:51:38.808450 
4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ngxh"
Jan 30 06:51:38 crc kubenswrapper[4841]: I0130 06:51:38.976139 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ngxh"]
Jan 30 06:51:39 crc kubenswrapper[4841]: I0130 06:51:39.890782 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ngxh" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="registry-server" containerID="cri-o://0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2" gracePeriod=2
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.376679 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ngxh"
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.462165 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btnk7\" (UniqueName: \"kubernetes.io/projected/00800c5e-91fd-4437-b1cf-6c5dece7d02c-kube-api-access-btnk7\") pod \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") "
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.462349 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-catalog-content\") pod \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") "
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.462459 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-utilities\") pod \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\" (UID: \"00800c5e-91fd-4437-b1cf-6c5dece7d02c\") "
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.462998 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-utilities" (OuterVolumeSpecName: "utilities") pod "00800c5e-91fd-4437-b1cf-6c5dece7d02c" (UID: "00800c5e-91fd-4437-b1cf-6c5dece7d02c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.468983 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00800c5e-91fd-4437-b1cf-6c5dece7d02c-kube-api-access-btnk7" (OuterVolumeSpecName: "kube-api-access-btnk7") pod "00800c5e-91fd-4437-b1cf-6c5dece7d02c" (UID: "00800c5e-91fd-4437-b1cf-6c5dece7d02c"). InnerVolumeSpecName "kube-api-access-btnk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.519176 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00800c5e-91fd-4437-b1cf-6c5dece7d02c" (UID: "00800c5e-91fd-4437-b1cf-6c5dece7d02c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.565377 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.565431 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btnk7\" (UniqueName: \"kubernetes.io/projected/00800c5e-91fd-4437-b1cf-6c5dece7d02c-kube-api-access-btnk7\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.565446 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00800c5e-91fd-4437-b1cf-6c5dece7d02c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.734461 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-766f7d9fb9-kf8rg"
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.746718 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-67dcbb9cbb-qvgxj"
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.843856 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-65545bbfd6-drrjx"]
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.871501 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-64c974b8d6-smh57"]
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.928059 4841 generic.go:334] "Generic (PLEG): container finished" podID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerID="0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2" exitCode=0
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.928105 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngxh" event={"ID":"00800c5e-91fd-4437-b1cf-6c5dece7d02c","Type":"ContainerDied","Data":"0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2"}
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.928908 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ngxh"
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.929221 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ngxh" event={"ID":"00800c5e-91fd-4437-b1cf-6c5dece7d02c","Type":"ContainerDied","Data":"8dfb67f0800f4b4e56d7f3f67cf7255dd7f89ee5665392495d29cb532b97d8fe"}
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.929249 4841 scope.go:117] "RemoveContainer" containerID="0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2"
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.964159 4841 scope.go:117] "RemoveContainer" containerID="8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc"
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.984515 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ngxh"]
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.988787 4841 scope.go:117] "RemoveContainer" containerID="ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885"
Jan 30 06:51:40 crc kubenswrapper[4841]: I0130 06:51:40.995818 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ngxh"]
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.052001 4841 scope.go:117] "RemoveContainer" containerID="0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2"
Jan 30 06:51:41 crc kubenswrapper[4841]: E0130 06:51:41.056941 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2\": container with ID starting with 0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2 not found: ID does not exist" containerID="0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.056972 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2"} err="failed to get container status \"0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2\": rpc error: code = NotFound desc = could not find container \"0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2\": container with ID starting with 0b0dab9833e413f0561f97defaeb715012a410ee2eeabe46149da7ce45797eb2 not found: ID does not exist"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.056993 4841 scope.go:117] "RemoveContainer" containerID="8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc"
Jan 30 06:51:41 crc kubenswrapper[4841]: E0130 06:51:41.057266 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc\": container with ID starting with 8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc not found: ID does not exist" containerID="8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.057306 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc"} err="failed to get container status \"8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc\": rpc error: code = NotFound desc = could not find container \"8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc\": container with ID starting with 8a6d6817790f4cfb4f68c08ffb7af36afe1e7b2f53150e206f46727dd5acbecc not found: ID does not exist"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.057333 4841 scope.go:117] "RemoveContainer" containerID="ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885"
Jan 30 06:51:41 crc kubenswrapper[4841]: E0130 06:51:41.057609 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885\": container with ID starting with ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885 not found: ID does not exist" containerID="ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.057634 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885"} err="failed to get container status \"ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885\": rpc error: code = NotFound desc = could not find container \"ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885\": container with ID starting with ab383f4a10622bde029a29293874a4c1eeea8bb226c9f0784f52a2234a3b0885 not found: ID does not exist"
Jan 30 06:51:41 crc kubenswrapper[4841]: E0130 06:51:41.163048 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Jan 30 06:51:41 crc kubenswrapper[4841]: E0130 06:51:41.164626 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Jan 30 06:51:41 crc kubenswrapper[4841]: E0130 06:51:41.165865 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Jan 30 06:51:41 crc kubenswrapper[4841]: E0130 06:51:41.165902 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6f87f7d7cf-94vvt" podUID="cc77bd65-bd9c-4a2e-842d-3606ce65a12b" containerName="heat-engine"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.340786 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.350619 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.394655 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-combined-ca-bundle\") pod \"095b719b-f5af-480d-a260-f44e526d65ee\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.394717 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data\") pod \"095b719b-f5af-480d-a260-f44e526d65ee\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.394758 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8lrb\" (UniqueName: \"kubernetes.io/projected/095b719b-f5af-480d-a260-f44e526d65ee-kube-api-access-v8lrb\") pod \"095b719b-f5af-480d-a260-f44e526d65ee\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.395102 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data-custom\") pod \"095b719b-f5af-480d-a260-f44e526d65ee\" (UID: \"095b719b-f5af-480d-a260-f44e526d65ee\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.400031 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "095b719b-f5af-480d-a260-f44e526d65ee" (UID: "095b719b-f5af-480d-a260-f44e526d65ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.401730 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095b719b-f5af-480d-a260-f44e526d65ee-kube-api-access-v8lrb" (OuterVolumeSpecName: "kube-api-access-v8lrb") pod "095b719b-f5af-480d-a260-f44e526d65ee" (UID: "095b719b-f5af-480d-a260-f44e526d65ee"). InnerVolumeSpecName "kube-api-access-v8lrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.437570 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "095b719b-f5af-480d-a260-f44e526d65ee" (UID: "095b719b-f5af-480d-a260-f44e526d65ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.460533 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data" (OuterVolumeSpecName: "config-data") pod "095b719b-f5af-480d-a260-f44e526d65ee" (UID: "095b719b-f5af-480d-a260-f44e526d65ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.497327 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data\") pod \"76af154e-00d9-4cf0-80dd-2b39090fb44e\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.497424 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-combined-ca-bundle\") pod \"76af154e-00d9-4cf0-80dd-2b39090fb44e\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.497594 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vzr2\" (UniqueName: \"kubernetes.io/projected/76af154e-00d9-4cf0-80dd-2b39090fb44e-kube-api-access-5vzr2\") pod \"76af154e-00d9-4cf0-80dd-2b39090fb44e\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.497709 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data-custom\") pod \"76af154e-00d9-4cf0-80dd-2b39090fb44e\" (UID: \"76af154e-00d9-4cf0-80dd-2b39090fb44e\") "
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.499111 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.499137 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.499148 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095b719b-f5af-480d-a260-f44e526d65ee-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.499159 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8lrb\" (UniqueName: \"kubernetes.io/projected/095b719b-f5af-480d-a260-f44e526d65ee-kube-api-access-v8lrb\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.500423 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76af154e-00d9-4cf0-80dd-2b39090fb44e-kube-api-access-5vzr2" (OuterVolumeSpecName: "kube-api-access-5vzr2") pod "76af154e-00d9-4cf0-80dd-2b39090fb44e" (UID: "76af154e-00d9-4cf0-80dd-2b39090fb44e"). InnerVolumeSpecName "kube-api-access-5vzr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.502053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "76af154e-00d9-4cf0-80dd-2b39090fb44e" (UID: "76af154e-00d9-4cf0-80dd-2b39090fb44e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.536381 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76af154e-00d9-4cf0-80dd-2b39090fb44e" (UID: "76af154e-00d9-4cf0-80dd-2b39090fb44e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.549350 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data" (OuterVolumeSpecName: "config-data") pod "76af154e-00d9-4cf0-80dd-2b39090fb44e" (UID: "76af154e-00d9-4cf0-80dd-2b39090fb44e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.600975 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.601024 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vzr2\" (UniqueName: \"kubernetes.io/projected/76af154e-00d9-4cf0-80dd-2b39090fb44e-kube-api-access-5vzr2\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.601039 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.601051 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76af154e-00d9-4cf0-80dd-2b39090fb44e-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.940280 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64c974b8d6-smh57"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.940283 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64c974b8d6-smh57" event={"ID":"76af154e-00d9-4cf0-80dd-2b39090fb44e","Type":"ContainerDied","Data":"75baba782775a936718d246f25f8145f681ac55bbc9fa4cd455a56988ef7cb59"}
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.940455 4841 scope.go:117] "RemoveContainer" containerID="9c4d57a9299f3a46afb5d17afcc3420d36852d863361443faefaf8ccbe724954"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.942115 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-65545bbfd6-drrjx" event={"ID":"095b719b-f5af-480d-a260-f44e526d65ee","Type":"ContainerDied","Data":"abd913d2b5d31669758505abb21840f3e6e6cadcc12c875d0ae9ebf8b05a761c"}
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.942173 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-65545bbfd6-drrjx"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.965734 4841 scope.go:117] "RemoveContainer" containerID="183fbe802cd4ddfa191bb52437f674dfa7bc66c9a7dc05127ce5d45a68f29335"
Jan 30 06:51:41 crc kubenswrapper[4841]: I0130 06:51:41.980905 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-64c974b8d6-smh57"]
Jan 30 06:51:42 crc kubenswrapper[4841]: I0130 06:51:42.003686 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-64c974b8d6-smh57"]
Jan 30 06:51:42 crc kubenswrapper[4841]: I0130 06:51:42.020700 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-65545bbfd6-drrjx"]
Jan 30 06:51:42 crc kubenswrapper[4841]: I0130 06:51:42.041804 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-65545bbfd6-drrjx"]
Jan 30 06:51:42 crc kubenswrapper[4841]: I0130 06:51:42.453798 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" path="/var/lib/kubelet/pods/00800c5e-91fd-4437-b1cf-6c5dece7d02c/volumes"
Jan 30 06:51:42 crc kubenswrapper[4841]: I0130 06:51:42.457234 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095b719b-f5af-480d-a260-f44e526d65ee" path="/var/lib/kubelet/pods/095b719b-f5af-480d-a260-f44e526d65ee/volumes"
Jan 30 06:51:42 crc kubenswrapper[4841]: I0130 06:51:42.458529 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" path="/var/lib/kubelet/pods/76af154e-00d9-4cf0-80dd-2b39090fb44e/volumes"
Jan 30 06:51:44 crc kubenswrapper[4841]: I0130 06:51:44.798543 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c6697b658-bhtrl" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.121:8443: connect: connection refused"
Jan 30 06:51:44 crc kubenswrapper[4841]: I0130 06:51:44.798881 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c6697b658-bhtrl"
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.031902 4841 generic.go:334] "Generic (PLEG): container finished" podID="cc77bd65-bd9c-4a2e-842d-3606ce65a12b" containerID="dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4" exitCode=0
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.031981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f87f7d7cf-94vvt" event={"ID":"cc77bd65-bd9c-4a2e-842d-3606ce65a12b","Type":"ContainerDied","Data":"dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4"}
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.305522 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6f87f7d7cf-94vvt"
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.386238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data\") pod \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") "
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.386504 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data-custom\") pod \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") "
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.386633 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-668rv\" (UniqueName: \"kubernetes.io/projected/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-kube-api-access-668rv\") pod \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") "
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.386675 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-combined-ca-bundle\") pod \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\" (UID: \"cc77bd65-bd9c-4a2e-842d-3606ce65a12b\") "
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.393469 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-kube-api-access-668rv" (OuterVolumeSpecName: "kube-api-access-668rv") pod "cc77bd65-bd9c-4a2e-842d-3606ce65a12b" (UID: "cc77bd65-bd9c-4a2e-842d-3606ce65a12b"). InnerVolumeSpecName "kube-api-access-668rv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.394310 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cc77bd65-bd9c-4a2e-842d-3606ce65a12b" (UID: "cc77bd65-bd9c-4a2e-842d-3606ce65a12b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.435151 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc77bd65-bd9c-4a2e-842d-3606ce65a12b" (UID: "cc77bd65-bd9c-4a2e-842d-3606ce65a12b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.451074 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data" (OuterVolumeSpecName: "config-data") pod "cc77bd65-bd9c-4a2e-842d-3606ce65a12b" (UID: "cc77bd65-bd9c-4a2e-842d-3606ce65a12b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.489150 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.489183 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-668rv\" (UniqueName: \"kubernetes.io/projected/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-kube-api-access-668rv\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.489195 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:49 crc kubenswrapper[4841]: I0130 06:51:49.489204 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc77bd65-bd9c-4a2e-842d-3606ce65a12b-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:50 crc kubenswrapper[4841]: I0130 06:51:50.048793 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f87f7d7cf-94vvt" event={"ID":"cc77bd65-bd9c-4a2e-842d-3606ce65a12b","Type":"ContainerDied","Data":"4d0d989186d7b25b1fa1c3c7d68cecdb36d3639e714c76cac31ec46bc40c1247"}
Jan 30 06:51:50 crc kubenswrapper[4841]: I0130 06:51:50.049118 4841 scope.go:117] "RemoveContainer" containerID="dd2543abbce5d1875b5229d80c68b6c6476076d303ca792ce54ce4cf5e1877a4"
Jan 30 06:51:50 crc kubenswrapper[4841]: I0130 06:51:50.048842 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6f87f7d7cf-94vvt"
Jan 30 06:51:50 crc kubenswrapper[4841]: I0130 06:51:50.095132 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6f87f7d7cf-94vvt"]
Jan 30 06:51:50 crc kubenswrapper[4841]: I0130 06:51:50.105987 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6f87f7d7cf-94vvt"]
Jan 30 06:51:50 crc kubenswrapper[4841]: I0130 06:51:50.456427 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc77bd65-bd9c-4a2e-842d-3606ce65a12b" path="/var/lib/kubelet/pods/cc77bd65-bd9c-4a2e-842d-3606ce65a12b/volumes"
Jan 30 06:51:53 crc kubenswrapper[4841]: I0130 06:51:53.504215 4841 scope.go:117] "RemoveContainer" containerID="70b1b32b149b58c6eed23d2aa01c089ac5a2727f76ae45632700dd9e2d709f9e"
Jan 30 06:51:53 crc kubenswrapper[4841]: I0130 06:51:53.554954 4841 scope.go:117] "RemoveContainer" containerID="eda2754cf0f3fbd0436f21c07e9cfbc7d78955b14f44a4941ef6db6def2ff46c"
Jan 30 06:51:53 crc kubenswrapper[4841]: I0130 06:51:53.591936 4841 scope.go:117] "RemoveContainer" containerID="4ace0a36f73702b5c9588fb2b4ac8ca3801a27ccfbcb1fb94475b8a13c78744a"
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.109213 4841 generic.go:334] "Generic (PLEG): container finished" podID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerID="26bf91e83f7f783284c5ae3f95e4da9624775557e77d94dea27183e0d82dd032" exitCode=137
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.109315 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6697b658-bhtrl" event={"ID":"cfa7cd93-66af-4dae-a610-d61b6235af81","Type":"ContainerDied","Data":"26bf91e83f7f783284c5ae3f95e4da9624775557e77d94dea27183e0d82dd032"}
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.216928 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6697b658-bhtrl"
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.307289 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-scripts\") pod \"cfa7cd93-66af-4dae-a610-d61b6235af81\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") "
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.307381 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-config-data\") pod \"cfa7cd93-66af-4dae-a610-d61b6235af81\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") "
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.307469 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjt9n\" (UniqueName: \"kubernetes.io/projected/cfa7cd93-66af-4dae-a610-d61b6235af81-kube-api-access-hjt9n\") pod \"cfa7cd93-66af-4dae-a610-d61b6235af81\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") "
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.307497 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-secret-key\") pod \"cfa7cd93-66af-4dae-a610-d61b6235af81\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") "
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.307523 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-combined-ca-bundle\") pod \"cfa7cd93-66af-4dae-a610-d61b6235af81\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") "
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.307702 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-tls-certs\") pod \"cfa7cd93-66af-4dae-a610-d61b6235af81\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") "
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.307781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfa7cd93-66af-4dae-a610-d61b6235af81-logs\") pod \"cfa7cd93-66af-4dae-a610-d61b6235af81\" (UID: \"cfa7cd93-66af-4dae-a610-d61b6235af81\") "
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.308666 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa7cd93-66af-4dae-a610-d61b6235af81-logs" (OuterVolumeSpecName: "logs") pod "cfa7cd93-66af-4dae-a610-d61b6235af81" (UID: "cfa7cd93-66af-4dae-a610-d61b6235af81"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.314305 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cfa7cd93-66af-4dae-a610-d61b6235af81" (UID: "cfa7cd93-66af-4dae-a610-d61b6235af81"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.314793 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa7cd93-66af-4dae-a610-d61b6235af81-kube-api-access-hjt9n" (OuterVolumeSpecName: "kube-api-access-hjt9n") pod "cfa7cd93-66af-4dae-a610-d61b6235af81" (UID: "cfa7cd93-66af-4dae-a610-d61b6235af81"). InnerVolumeSpecName "kube-api-access-hjt9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.344032 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfa7cd93-66af-4dae-a610-d61b6235af81" (UID: "cfa7cd93-66af-4dae-a610-d61b6235af81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.348536 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-scripts" (OuterVolumeSpecName: "scripts") pod "cfa7cd93-66af-4dae-a610-d61b6235af81" (UID: "cfa7cd93-66af-4dae-a610-d61b6235af81"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.375376 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-config-data" (OuterVolumeSpecName: "config-data") pod "cfa7cd93-66af-4dae-a610-d61b6235af81" (UID: "cfa7cd93-66af-4dae-a610-d61b6235af81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.377515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "cfa7cd93-66af-4dae-a610-d61b6235af81" (UID: "cfa7cd93-66af-4dae-a610-d61b6235af81"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.411634 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjt9n\" (UniqueName: \"kubernetes.io/projected/cfa7cd93-66af-4dae-a610-d61b6235af81-kube-api-access-hjt9n\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.411681 4841 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.411694 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.411705 4841 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa7cd93-66af-4dae-a610-d61b6235af81-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.411717 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfa7cd93-66af-4dae-a610-d61b6235af81-logs\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.411728 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:54 crc kubenswrapper[4841]: I0130 06:51:54.411739 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfa7cd93-66af-4dae-a610-d61b6235af81-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:51:55 crc kubenswrapper[4841]: I0130 06:51:55.127054 4841
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6697b658-bhtrl" event={"ID":"cfa7cd93-66af-4dae-a610-d61b6235af81","Type":"ContainerDied","Data":"07be13e42f8f1e277979d6d1e4463c7ce9aa70bf75e5a8466f4b98fd69193687"} Jan 30 06:51:55 crc kubenswrapper[4841]: I0130 06:51:55.127127 4841 scope.go:117] "RemoveContainer" containerID="d49ea3d715b7b663c44877634bae9eb11ba651f1df614d0a17cf00d282c2a96c" Jan 30 06:51:55 crc kubenswrapper[4841]: I0130 06:51:55.127127 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6697b658-bhtrl" Jan 30 06:51:55 crc kubenswrapper[4841]: I0130 06:51:55.158527 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c6697b658-bhtrl"] Jan 30 06:51:55 crc kubenswrapper[4841]: I0130 06:51:55.165940 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c6697b658-bhtrl"] Jan 30 06:51:55 crc kubenswrapper[4841]: I0130 06:51:55.364720 4841 scope.go:117] "RemoveContainer" containerID="26bf91e83f7f783284c5ae3f95e4da9624775557e77d94dea27183e0d82dd032" Jan 30 06:51:56 crc kubenswrapper[4841]: I0130 06:51:56.447137 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" path="/var/lib/kubelet/pods/cfa7cd93-66af-4dae-a610-d61b6235af81/volumes" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.190202 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9"] Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191522 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc77bd65-bd9c-4a2e-842d-3606ce65a12b" containerName="heat-engine" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191544 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc77bd65-bd9c-4a2e-842d-3606ce65a12b" containerName="heat-engine" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 
06:52:05.191563 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191576 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191590 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="registry-server" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191601 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="registry-server" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191624 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095b719b-f5af-480d-a260-f44e526d65ee" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191634 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="095b719b-f5af-480d-a260-f44e526d65ee" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191666 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191680 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191700 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="extract-utilities" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191711 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="extract-utilities" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191730 4841 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="095b719b-f5af-480d-a260-f44e526d65ee" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191741 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="095b719b-f5af-480d-a260-f44e526d65ee" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191773 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191785 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191804 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon-log" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191813 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon-log" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191836 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098c43da-f08f-4050-bed0-08952682d551" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191846 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="098c43da-f08f-4050-bed0-08952682d551" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191869 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="extract-content" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191880 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="extract-content" Jan 30 06:52:05 crc kubenswrapper[4841]: E0130 06:52:05.191904 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.191914 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192207 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192232 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="095b719b-f5af-480d-a260-f44e526d65ee" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192256 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="098c43da-f08f-4050-bed0-08952682d551" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192269 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="76af154e-00d9-4cf0-80dd-2b39090fb44e" containerName="heat-cfnapi" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192281 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafd8dae-b1e5-4c27-9644-0d1ebf5fb446" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192291 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="00800c5e-91fd-4437-b1cf-6c5dece7d02c" containerName="registry-server" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192307 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc77bd65-bd9c-4a2e-842d-3606ce65a12b" containerName="heat-engine" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192325 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192343 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cfa7cd93-66af-4dae-a610-d61b6235af81" containerName="horizon-log" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.192359 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="095b719b-f5af-480d-a260-f44e526d65ee" containerName="heat-api" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.195625 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.200919 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.238141 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9"] Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.378681 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrgd\" (UniqueName: \"kubernetes.io/projected/ac910689-bd23-493a-9b20-89e21df5d758-kube-api-access-xfrgd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.378783 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.378881 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.480781 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.481620 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.481771 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrgd\" (UniqueName: \"kubernetes.io/projected/ac910689-bd23-493a-9b20-89e21df5d758-kube-api-access-xfrgd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.481839 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-util\") 
pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.482203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.504939 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfrgd\" (UniqueName: \"kubernetes.io/projected/ac910689-bd23-493a-9b20-89e21df5d758-kube-api-access-xfrgd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:05 crc kubenswrapper[4841]: I0130 06:52:05.551066 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:06 crc kubenswrapper[4841]: I0130 06:52:06.027099 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9"] Jan 30 06:52:06 crc kubenswrapper[4841]: I0130 06:52:06.270344 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" event={"ID":"ac910689-bd23-493a-9b20-89e21df5d758","Type":"ContainerStarted","Data":"49e44d2ac73ef3e39133eb48ad16e8c8425871534d7f24485825998bcd993bcc"} Jan 30 06:52:06 crc kubenswrapper[4841]: I0130 06:52:06.270468 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" event={"ID":"ac910689-bd23-493a-9b20-89e21df5d758","Type":"ContainerStarted","Data":"cb6adfe0c8dad4bde2f17ce17aa1df91cac19a0575e4a25a10556509d1d28bc4"} Jan 30 06:52:07 crc kubenswrapper[4841]: I0130 06:52:07.288887 4841 generic.go:334] "Generic (PLEG): container finished" podID="ac910689-bd23-493a-9b20-89e21df5d758" containerID="49e44d2ac73ef3e39133eb48ad16e8c8425871534d7f24485825998bcd993bcc" exitCode=0 Jan 30 06:52:07 crc kubenswrapper[4841]: I0130 06:52:07.288979 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" event={"ID":"ac910689-bd23-493a-9b20-89e21df5d758","Type":"ContainerDied","Data":"49e44d2ac73ef3e39133eb48ad16e8c8425871534d7f24485825998bcd993bcc"} Jan 30 06:52:09 crc kubenswrapper[4841]: I0130 06:52:09.324291 4841 generic.go:334] "Generic (PLEG): container finished" podID="ac910689-bd23-493a-9b20-89e21df5d758" containerID="995681ef50277c9b340e1d51cd3698c8d86e5baa2c4eb1825e3546940e195d45" exitCode=0 Jan 30 06:52:09 crc kubenswrapper[4841]: I0130 06:52:09.324432 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" event={"ID":"ac910689-bd23-493a-9b20-89e21df5d758","Type":"ContainerDied","Data":"995681ef50277c9b340e1d51cd3698c8d86e5baa2c4eb1825e3546940e195d45"} Jan 30 06:52:10 crc kubenswrapper[4841]: I0130 06:52:10.341537 4841 generic.go:334] "Generic (PLEG): container finished" podID="ac910689-bd23-493a-9b20-89e21df5d758" containerID="40c38a81935afc4fb76ec44d3b7c603fbf5c101590e6ed8f0717c17384370945" exitCode=0 Jan 30 06:52:10 crc kubenswrapper[4841]: I0130 06:52:10.341664 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" event={"ID":"ac910689-bd23-493a-9b20-89e21df5d758","Type":"ContainerDied","Data":"40c38a81935afc4fb76ec44d3b7c603fbf5c101590e6ed8f0717c17384370945"} Jan 30 06:52:11 crc kubenswrapper[4841]: I0130 06:52:11.805222 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:11 crc kubenswrapper[4841]: I0130 06:52:11.933449 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfrgd\" (UniqueName: \"kubernetes.io/projected/ac910689-bd23-493a-9b20-89e21df5d758-kube-api-access-xfrgd\") pod \"ac910689-bd23-493a-9b20-89e21df5d758\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " Jan 30 06:52:11 crc kubenswrapper[4841]: I0130 06:52:11.933571 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-bundle\") pod \"ac910689-bd23-493a-9b20-89e21df5d758\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " Jan 30 06:52:11 crc kubenswrapper[4841]: I0130 06:52:11.933756 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-util\") pod \"ac910689-bd23-493a-9b20-89e21df5d758\" (UID: \"ac910689-bd23-493a-9b20-89e21df5d758\") " Jan 30 06:52:11 crc kubenswrapper[4841]: I0130 06:52:11.936308 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-bundle" (OuterVolumeSpecName: "bundle") pod "ac910689-bd23-493a-9b20-89e21df5d758" (UID: "ac910689-bd23-493a-9b20-89e21df5d758"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:52:11 crc kubenswrapper[4841]: I0130 06:52:11.945768 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac910689-bd23-493a-9b20-89e21df5d758-kube-api-access-xfrgd" (OuterVolumeSpecName: "kube-api-access-xfrgd") pod "ac910689-bd23-493a-9b20-89e21df5d758" (UID: "ac910689-bd23-493a-9b20-89e21df5d758"). InnerVolumeSpecName "kube-api-access-xfrgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:52:11 crc kubenswrapper[4841]: I0130 06:52:11.950506 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-util" (OuterVolumeSpecName: "util") pod "ac910689-bd23-493a-9b20-89e21df5d758" (UID: "ac910689-bd23-493a-9b20-89e21df5d758"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:52:12 crc kubenswrapper[4841]: I0130 06:52:12.036839 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-util\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:12 crc kubenswrapper[4841]: I0130 06:52:12.036891 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfrgd\" (UniqueName: \"kubernetes.io/projected/ac910689-bd23-493a-9b20-89e21df5d758-kube-api-access-xfrgd\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:12 crc kubenswrapper[4841]: I0130 06:52:12.036911 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac910689-bd23-493a-9b20-89e21df5d758-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:12 crc kubenswrapper[4841]: I0130 06:52:12.362901 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" event={"ID":"ac910689-bd23-493a-9b20-89e21df5d758","Type":"ContainerDied","Data":"cb6adfe0c8dad4bde2f17ce17aa1df91cac19a0575e4a25a10556509d1d28bc4"} Jan 30 06:52:12 crc kubenswrapper[4841]: I0130 06:52:12.362945 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb6adfe0c8dad4bde2f17ce17aa1df91cac19a0575e4a25a10556509d1d28bc4" Jan 30 06:52:12 crc kubenswrapper[4841]: I0130 06:52:12.363013 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.007318 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b"] Jan 30 06:52:22 crc kubenswrapper[4841]: E0130 06:52:22.008123 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac910689-bd23-493a-9b20-89e21df5d758" containerName="pull" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.008135 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac910689-bd23-493a-9b20-89e21df5d758" containerName="pull" Jan 30 06:52:22 crc kubenswrapper[4841]: E0130 06:52:22.008145 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac910689-bd23-493a-9b20-89e21df5d758" containerName="util" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.008151 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac910689-bd23-493a-9b20-89e21df5d758" containerName="util" Jan 30 06:52:22 crc kubenswrapper[4841]: E0130 06:52:22.008181 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac910689-bd23-493a-9b20-89e21df5d758" containerName="extract" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.008187 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac910689-bd23-493a-9b20-89e21df5d758" containerName="extract" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.008341 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac910689-bd23-493a-9b20-89e21df5d758" containerName="extract" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.008993 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.011208 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.011211 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.014510 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-x5s7q" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.030808 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.127146 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.128591 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.131188 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-hxd2w" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.134193 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.135391 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.141999 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.145217 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.161971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4prb\" (UniqueName: \"kubernetes.io/projected/226c058c-6fb2-493d-ac46-d42aeec0a369-kube-api-access-p4prb\") pod \"obo-prometheus-operator-68bc856cb9-t9p5b\" (UID: \"226c058c-6fb2-493d-ac46-d42aeec0a369\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.162486 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.263837 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4prb\" (UniqueName: \"kubernetes.io/projected/226c058c-6fb2-493d-ac46-d42aeec0a369-kube-api-access-p4prb\") pod \"obo-prometheus-operator-68bc856cb9-t9p5b\" (UID: \"226c058c-6fb2-493d-ac46-d42aeec0a369\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.263895 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04deced0-d0da-4612-a8d3-7c03ec537275-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5\" (UID: 
\"04deced0-d0da-4612-a8d3-7c03ec537275\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.263932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04deced0-d0da-4612-a8d3-7c03ec537275-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5\" (UID: \"04deced0-d0da-4612-a8d3-7c03ec537275\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.264019 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3039e990-e132-43ec-bef0-22d0c3c66705-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns\" (UID: \"3039e990-e132-43ec-bef0-22d0c3c66705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.264070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3039e990-e132-43ec-bef0-22d0c3c66705-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns\" (UID: \"3039e990-e132-43ec-bef0-22d0c3c66705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.292818 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4prb\" (UniqueName: \"kubernetes.io/projected/226c058c-6fb2-493d-ac46-d42aeec0a369-kube-api-access-p4prb\") pod \"obo-prometheus-operator-68bc856cb9-t9p5b\" (UID: \"226c058c-6fb2-493d-ac46-d42aeec0a369\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" Jan 30 06:52:22 
crc kubenswrapper[4841]: I0130 06:52:22.333649 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.337675 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ql57f"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.338807 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.341929 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.342103 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-lz6cx" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.360072 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ql57f"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.366527 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04deced0-d0da-4612-a8d3-7c03ec537275-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5\" (UID: \"04deced0-d0da-4612-a8d3-7c03ec537275\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.366561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04deced0-d0da-4612-a8d3-7c03ec537275-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5\" (UID: \"04deced0-d0da-4612-a8d3-7c03ec537275\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.366628 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3039e990-e132-43ec-bef0-22d0c3c66705-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns\" (UID: \"3039e990-e132-43ec-bef0-22d0c3c66705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.366668 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3039e990-e132-43ec-bef0-22d0c3c66705-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns\" (UID: \"3039e990-e132-43ec-bef0-22d0c3c66705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.391963 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04deced0-d0da-4612-a8d3-7c03ec537275-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5\" (UID: \"04deced0-d0da-4612-a8d3-7c03ec537275\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.394049 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04deced0-d0da-4612-a8d3-7c03ec537275-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5\" (UID: \"04deced0-d0da-4612-a8d3-7c03ec537275\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.404958 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3039e990-e132-43ec-bef0-22d0c3c66705-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns\" (UID: \"3039e990-e132-43ec-bef0-22d0c3c66705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.407569 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3039e990-e132-43ec-bef0-22d0c3c66705-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns\" (UID: \"3039e990-e132-43ec-bef0-22d0c3c66705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.448178 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.458793 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.469674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t5xz\" (UniqueName: \"kubernetes.io/projected/971ec121-d790-4ee9-b43a-6e924e45fd27-kube-api-access-2t5xz\") pod \"observability-operator-59bdc8b94-ql57f\" (UID: \"971ec121-d790-4ee9-b43a-6e924e45fd27\") " pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.469749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/971ec121-d790-4ee9-b43a-6e924e45fd27-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ql57f\" (UID: \"971ec121-d790-4ee9-b43a-6e924e45fd27\") " pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.574538 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t5xz\" (UniqueName: \"kubernetes.io/projected/971ec121-d790-4ee9-b43a-6e924e45fd27-kube-api-access-2t5xz\") pod \"observability-operator-59bdc8b94-ql57f\" (UID: \"971ec121-d790-4ee9-b43a-6e924e45fd27\") " pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.574656 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/971ec121-d790-4ee9-b43a-6e924e45fd27-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ql57f\" (UID: \"971ec121-d790-4ee9-b43a-6e924e45fd27\") " pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.594174 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/971ec121-d790-4ee9-b43a-6e924e45fd27-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ql57f\" (UID: \"971ec121-d790-4ee9-b43a-6e924e45fd27\") " pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.613071 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t5xz\" (UniqueName: \"kubernetes.io/projected/971ec121-d790-4ee9-b43a-6e924e45fd27-kube-api-access-2t5xz\") pod \"observability-operator-59bdc8b94-ql57f\" (UID: \"971ec121-d790-4ee9-b43a-6e924e45fd27\") " pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.613132 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8ljvk"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.614371 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.617032 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-nghsp" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.634503 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8ljvk"] Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.781617 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gtbs\" (UniqueName: \"kubernetes.io/projected/9d5142af-ee4e-4290-bb79-e7ee3e20fca3-kube-api-access-7gtbs\") pod \"perses-operator-5bf474d74f-8ljvk\" (UID: \"9d5142af-ee4e-4290-bb79-e7ee3e20fca3\") " pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.781686 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9d5142af-ee4e-4290-bb79-e7ee3e20fca3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8ljvk\" (UID: \"9d5142af-ee4e-4290-bb79-e7ee3e20fca3\") " pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.882964 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gtbs\" (UniqueName: \"kubernetes.io/projected/9d5142af-ee4e-4290-bb79-e7ee3e20fca3-kube-api-access-7gtbs\") pod \"perses-operator-5bf474d74f-8ljvk\" (UID: \"9d5142af-ee4e-4290-bb79-e7ee3e20fca3\") " pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.883287 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d5142af-ee4e-4290-bb79-e7ee3e20fca3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8ljvk\" (UID: \"9d5142af-ee4e-4290-bb79-e7ee3e20fca3\") " pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.884105 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9d5142af-ee4e-4290-bb79-e7ee3e20fca3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8ljvk\" (UID: \"9d5142af-ee4e-4290-bb79-e7ee3e20fca3\") " pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.896170 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.902977 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gtbs\" (UniqueName: \"kubernetes.io/projected/9d5142af-ee4e-4290-bb79-e7ee3e20fca3-kube-api-access-7gtbs\") pod \"perses-operator-5bf474d74f-8ljvk\" (UID: \"9d5142af-ee4e-4290-bb79-e7ee3e20fca3\") " pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:22 crc kubenswrapper[4841]: I0130 06:52:22.945895 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.046994 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b"] Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.161799 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5"] Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.179224 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns"] Jan 30 06:52:23 crc kubenswrapper[4841]: W0130 06:52:23.191438 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3039e990_e132_43ec_bef0_22d0c3c66705.slice/crio-c78d6167ab01842e34329208c9bbdfdaa968bf9d5ac05d5e9b98b26cc91e43ed WatchSource:0}: Error finding container c78d6167ab01842e34329208c9bbdfdaa968bf9d5ac05d5e9b98b26cc91e43ed: Status 404 returned error can't find the container with id c78d6167ab01842e34329208c9bbdfdaa968bf9d5ac05d5e9b98b26cc91e43ed Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.433233 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ql57f"] Jan 30 06:52:23 crc kubenswrapper[4841]: W0130 06:52:23.441547 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod971ec121_d790_4ee9_b43a_6e924e45fd27.slice/crio-43c62bafb995eebf2e7bf33e13b8182ab90182644ed80f6c81467b8737da7165 WatchSource:0}: Error finding container 43c62bafb995eebf2e7bf33e13b8182ab90182644ed80f6c81467b8737da7165: Status 404 returned error can't find the container with id 43c62bafb995eebf2e7bf33e13b8182ab90182644ed80f6c81467b8737da7165 Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 
06:52:23.486120 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" event={"ID":"226c058c-6fb2-493d-ac46-d42aeec0a369","Type":"ContainerStarted","Data":"8e11b82ade47aa143b7b64f041b82c372701160de182e912af78a7c0f0da18b3"} Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.488261 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" event={"ID":"3039e990-e132-43ec-bef0-22d0c3c66705","Type":"ContainerStarted","Data":"c78d6167ab01842e34329208c9bbdfdaa968bf9d5ac05d5e9b98b26cc91e43ed"} Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.490024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ql57f" event={"ID":"971ec121-d790-4ee9-b43a-6e924e45fd27","Type":"ContainerStarted","Data":"43c62bafb995eebf2e7bf33e13b8182ab90182644ed80f6c81467b8737da7165"} Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.491248 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" event={"ID":"04deced0-d0da-4612-a8d3-7c03ec537275","Type":"ContainerStarted","Data":"db245fb2936e2e732fd45fda732fe8b94f8baa8a6f4bf4c8107f4638dcc1870d"} Jan 30 06:52:23 crc kubenswrapper[4841]: I0130 06:52:23.543360 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8ljvk"] Jan 30 06:52:23 crc kubenswrapper[4841]: W0130 06:52:23.545418 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d5142af_ee4e_4290_bb79_e7ee3e20fca3.slice/crio-77c2b311e3edc2a9eefcec79b0f6862f1a4f8a27b419af56b287fd0bb1e52342 WatchSource:0}: Error finding container 77c2b311e3edc2a9eefcec79b0f6862f1a4f8a27b419af56b287fd0bb1e52342: Status 404 returned error can't find the container with id 
77c2b311e3edc2a9eefcec79b0f6862f1a4f8a27b419af56b287fd0bb1e52342 Jan 30 06:52:24 crc kubenswrapper[4841]: I0130 06:52:24.627098 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" event={"ID":"9d5142af-ee4e-4290-bb79-e7ee3e20fca3","Type":"ContainerStarted","Data":"77c2b311e3edc2a9eefcec79b0f6862f1a4f8a27b419af56b287fd0bb1e52342"} Jan 30 06:52:37 crc kubenswrapper[4841]: E0130 06:52:37.677610 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a" Jan 30 06:52:37 crc kubenswrapper[4841]: E0130 06:52:37.678693 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true 
--disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p4prb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-68bc856cb9-t9p5b_openshift-operators(226c058c-6fb2-493d-ac46-d42aeec0a369): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 06:52:37 crc kubenswrapper[4841]: E0130 06:52:37.680005 4841 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" podUID="226c058c-6fb2-493d-ac46-d42aeec0a369" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.047421 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9431-account-create-update-fv8z4"] Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.058173 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-knpsp"] Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.067291 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-knpsp"] Jan 30 06:52:38 crc kubenswrapper[4841]: E0130 06:52:38.076130 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a\\\"\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" podUID="226c058c-6fb2-493d-ac46-d42aeec0a369" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.076696 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9431-account-create-update-fv8z4"] Jan 30 06:52:38 crc kubenswrapper[4841]: E0130 06:52:38.091603 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Jan 30 06:52:38 crc kubenswrapper[4841]: E0130 
06:52:38.091749 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns_openshift-operators(3039e990-e132-43ec-bef0-22d0c3c66705): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 06:52:38 crc kubenswrapper[4841]: E0130 06:52:38.093295 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" podUID="3039e990-e132-43ec-bef0-22d0c3c66705" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.441797 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c71e210-3b42-46f6-ab07-55bcbd35e2d1" path="/var/lib/kubelet/pods/0c71e210-3b42-46f6-ab07-55bcbd35e2d1/volumes" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.442863 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80" path="/var/lib/kubelet/pods/b82e5ef1-3deb-4d3b-8ca7-fbd50c8d3e80/volumes" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.789028 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" event={"ID":"04deced0-d0da-4612-a8d3-7c03ec537275","Type":"ContainerStarted","Data":"6d43cd3996e72b53921323dcf9baaa93a0d249b094f1559e1acfc10d529e0074"} Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.793042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" event={"ID":"9d5142af-ee4e-4290-bb79-e7ee3e20fca3","Type":"ContainerStarted","Data":"92900d0cb3d3491043b4b8337b263bc11e8a674287dc41308913e3277ba9e558"} Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.793183 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.799349 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ql57f" event={"ID":"971ec121-d790-4ee9-b43a-6e924e45fd27","Type":"ContainerStarted","Data":"fe3694f15eb30eee3089d37e54ae636586d30a18b0b78255715efb617577fb8c"} Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.799605 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.848321 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5" podStartSLOduration=1.8966427879999999 podStartE2EDuration="16.848295786s" podCreationTimestamp="2026-01-30 06:52:22 +0000 UTC" firstStartedPulling="2026-01-30 06:52:23.166609027 +0000 UTC m=+6280.160081665" lastFinishedPulling="2026-01-30 06:52:38.118262025 +0000 UTC m=+6295.111734663" observedRunningTime="2026-01-30 06:52:38.822011866 +0000 UTC m=+6295.815484504" watchObservedRunningTime="2026-01-30 06:52:38.848295786 +0000 UTC m=+6295.841768434" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.866693 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-ql57f" podStartSLOduration=2.184704374 podStartE2EDuration="16.866674177s" podCreationTimestamp="2026-01-30 06:52:22 +0000 UTC" firstStartedPulling="2026-01-30 06:52:23.446683318 +0000 UTC m=+6280.440155956" lastFinishedPulling="2026-01-30 06:52:38.128653121 +0000 UTC m=+6295.122125759" observedRunningTime="2026-01-30 06:52:38.854032189 +0000 UTC m=+6295.847504837" watchObservedRunningTime="2026-01-30 06:52:38.866674177 +0000 UTC m=+6295.860146815" Jan 30 06:52:38 crc kubenswrapper[4841]: 
I0130 06:52:38.897567 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-ql57f" Jan 30 06:52:38 crc kubenswrapper[4841]: I0130 06:52:38.906611 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" podStartSLOduration=2.357667197 podStartE2EDuration="16.906593362s" podCreationTimestamp="2026-01-30 06:52:22 +0000 UTC" firstStartedPulling="2026-01-30 06:52:23.551179255 +0000 UTC m=+6280.544651893" lastFinishedPulling="2026-01-30 06:52:38.10010542 +0000 UTC m=+6295.093578058" observedRunningTime="2026-01-30 06:52:38.889814124 +0000 UTC m=+6295.883286762" watchObservedRunningTime="2026-01-30 06:52:38.906593362 +0000 UTC m=+6295.900065990" Jan 30 06:52:39 crc kubenswrapper[4841]: I0130 06:52:39.820538 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" event={"ID":"3039e990-e132-43ec-bef0-22d0c3c66705","Type":"ContainerStarted","Data":"02ea30a74c08366ed427bc46d83e4a58ffecbcff4af43fb293080c153a3b92c9"} Jan 30 06:52:49 crc kubenswrapper[4841]: I0130 06:52:49.471639 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns" podStartSLOduration=-9223372009.383158 podStartE2EDuration="27.471617854s" podCreationTimestamp="2026-01-30 06:52:22 +0000 UTC" firstStartedPulling="2026-01-30 06:52:23.202873275 +0000 UTC m=+6280.196345913" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:52:39.84976385 +0000 UTC m=+6296.843236498" watchObservedRunningTime="2026-01-30 06:52:49.471617854 +0000 UTC m=+6306.465090502" Jan 30 06:52:51 crc kubenswrapper[4841]: I0130 06:52:51.042577 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8pzcg"] Jan 30 06:52:51 crc kubenswrapper[4841]: 
I0130 06:52:51.053280 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8pzcg"] Jan 30 06:52:51 crc kubenswrapper[4841]: I0130 06:52:51.992545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" event={"ID":"226c058c-6fb2-493d-ac46-d42aeec0a369","Type":"ContainerStarted","Data":"defa30859bf3d6ce4c873cefb64246e03176095a07fc3570a1037324831ef6f6"} Jan 30 06:52:52 crc kubenswrapper[4841]: I0130 06:52:52.025035 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t9p5b" podStartSLOduration=3.850974264 podStartE2EDuration="31.02500666s" podCreationTimestamp="2026-01-30 06:52:21 +0000 UTC" firstStartedPulling="2026-01-30 06:52:23.07822865 +0000 UTC m=+6280.071701288" lastFinishedPulling="2026-01-30 06:52:50.252261006 +0000 UTC m=+6307.245733684" observedRunningTime="2026-01-30 06:52:52.016032311 +0000 UTC m=+6309.009504979" watchObservedRunningTime="2026-01-30 06:52:52.02500666 +0000 UTC m=+6309.018479338" Jan 30 06:52:52 crc kubenswrapper[4841]: I0130 06:52:52.446699 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d66874-b86c-45c1-80b3-23f6f4741c37" path="/var/lib/kubelet/pods/21d66874-b86c-45c1-80b3-23f6f4741c37/volumes" Jan 30 06:52:52 crc kubenswrapper[4841]: I0130 06:52:52.950366 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-8ljvk" Jan 30 06:52:53 crc kubenswrapper[4841]: I0130 06:52:53.873953 4841 scope.go:117] "RemoveContainer" containerID="fa2ca2b693605bd47db25d5b8fbb79d075832829fe0ba13ef71ba32be9c736a4" Jan 30 06:52:53 crc kubenswrapper[4841]: I0130 06:52:53.916674 4841 scope.go:117] "RemoveContainer" containerID="eb94a0c2c11530f01bce52ca38fa5f747ab7a54fb94a92cba3fdf274f6c21662" Jan 30 06:52:53 crc kubenswrapper[4841]: I0130 06:52:53.976718 4841 scope.go:117] 
"RemoveContainer" containerID="1c64e1f4c727edeb66150143ae8a5d7dbb6d77d5dc8cdbfd3be87b1c00dc58a4" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.899616 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.900065 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="4daf8ad7-5722-443b-9ac6-f9742ba7db0a" containerName="openstackclient" containerID="cri-o://cf7fa48e8b80bd88ae241deae17a448b0c4ee4e2d53672773ea5fc32fecfcec4" gracePeriod=2 Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.911301 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.931185 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:57 crc kubenswrapper[4841]: E0130 06:52:57.931600 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4daf8ad7-5722-443b-9ac6-f9742ba7db0a" containerName="openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.931617 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4daf8ad7-5722-443b-9ac6-f9742ba7db0a" containerName="openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.931787 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4daf8ad7-5722-443b-9ac6-f9742ba7db0a" containerName="openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.932384 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.948684 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.962818 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkdtm\" (UniqueName: \"kubernetes.io/projected/8ab7aae2-b5ed-4399-8016-5ec0e9101070-kube-api-access-pkdtm\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.962929 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.962951 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.963028 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config-secret\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.973288 4841 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ab7aae2-b5ed-4399-8016-5ec0e9101070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T06:52:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T06:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T06:52:57Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T06:52:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T06:52:57Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.982544 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:57 crc kubenswrapper[4841]: E0130 06:52:57.983297 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-pkdtm openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" 
podUID="8ab7aae2-b5ed-4399-8016-5ec0e9101070" Jan 30 06:52:57 crc kubenswrapper[4841]: I0130 06:52:57.996489 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.014427 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.015794 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.018275 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4daf8ad7-5722-443b-9ac6-f9742ba7db0a" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.019808 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8ab7aae2-b5ed-4399-8016-5ec0e9101070" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.034674 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.064934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5mp\" (UniqueName: \"kubernetes.io/projected/5f8c4ab1-6430-4b53-8021-0bfeba020584-kube-api-access-nq5mp\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.064988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config-secret\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" 
Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.065011 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config-secret\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.065039 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.065088 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.065109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkdtm\" (UniqueName: \"kubernetes.io/projected/8ab7aae2-b5ed-4399-8016-5ec0e9101070-kube-api-access-pkdtm\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.065177 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.065197 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.067037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: E0130 06:52:58.088969 4841 projected.go:194] Error preparing data for projected volume kube-api-access-pkdtm for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (8ab7aae2-b5ed-4399-8016-5ec0e9101070) does not match the UID in record. The object might have been deleted and then recreated Jan 30 06:52:58 crc kubenswrapper[4841]: E0130 06:52:58.089271 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ab7aae2-b5ed-4399-8016-5ec0e9101070-kube-api-access-pkdtm podName:8ab7aae2-b5ed-4399-8016-5ec0e9101070 nodeName:}" failed. No retries permitted until 2026-01-30 06:52:58.589252593 +0000 UTC m=+6315.582725231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pkdtm" (UniqueName: "kubernetes.io/projected/8ab7aae2-b5ed-4399-8016-5ec0e9101070-kube-api-access-pkdtm") pod "openstackclient" (UID: "8ab7aae2-b5ed-4399-8016-5ec0e9101070") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (8ab7aae2-b5ed-4399-8016-5ec0e9101070) does not match the UID in record. 
The object might have been deleted and then recreated Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.090658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.089093 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config-secret\") pod \"openstackclient\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.092503 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.096854 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8ab7aae2-b5ed-4399-8016-5ec0e9101070" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.106476 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.107838 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.109614 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rjp5m" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.131296 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.135176 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.144776 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8ab7aae2-b5ed-4399-8016-5ec0e9101070" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.169698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-combined-ca-bundle\") pod \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.169778 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config\") pod \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.169839 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config-secret\") pod \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\" (UID: \"8ab7aae2-b5ed-4399-8016-5ec0e9101070\") " Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.170204 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmg5g\" (UniqueName: \"kubernetes.io/projected/f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce-kube-api-access-fmg5g\") pod \"kube-state-metrics-0\" (UID: \"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce\") " pod="openstack/kube-state-metrics-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.170334 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5mp\" (UniqueName: \"kubernetes.io/projected/5f8c4ab1-6430-4b53-8021-0bfeba020584-kube-api-access-nq5mp\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.170388 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config-secret\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.170445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.170562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.170774 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkdtm\" (UniqueName: 
\"kubernetes.io/projected/8ab7aae2-b5ed-4399-8016-5ec0e9101070-kube-api-access-pkdtm\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.174032 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8ab7aae2-b5ed-4399-8016-5ec0e9101070" (UID: "8ab7aae2-b5ed-4399-8016-5ec0e9101070"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.174177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.187520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8ab7aae2-b5ed-4399-8016-5ec0e9101070" (UID: "8ab7aae2-b5ed-4399-8016-5ec0e9101070"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.193808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config-secret\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.202673 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ab7aae2-b5ed-4399-8016-5ec0e9101070" (UID: "8ab7aae2-b5ed-4399-8016-5ec0e9101070"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.235984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.237707 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5mp\" (UniqueName: \"kubernetes.io/projected/5f8c4ab1-6430-4b53-8021-0bfeba020584-kube-api-access-nq5mp\") pod \"openstackclient\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") " pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.276743 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmg5g\" (UniqueName: \"kubernetes.io/projected/f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce-kube-api-access-fmg5g\") pod \"kube-state-metrics-0\" (UID: \"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce\") " pod="openstack/kube-state-metrics-0" Jan 30 
06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.277062 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.277137 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.277195 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ab7aae2-b5ed-4399-8016-5ec0e9101070-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.298846 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmg5g\" (UniqueName: \"kubernetes.io/projected/f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce-kube-api-access-fmg5g\") pod \"kube-state-metrics-0\" (UID: \"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce\") " pod="openstack/kube-state-metrics-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.338815 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.450729 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.462511 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab7aae2-b5ed-4399-8016-5ec0e9101070" path="/var/lib/kubelet/pods/8ab7aae2-b5ed-4399-8016-5ec0e9101070/volumes" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.918243 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.922722 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.927975 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.928124 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-lv8gx" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.928256 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.928367 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.928432 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.939626 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.997611 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.997656 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbb5c437-a583-437d-9d16-34a0f8ed492e-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.997681 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.997816 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bbb5c437-a583-437d-9d16-34a0f8ed492e-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.997948 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42fkm\" (UniqueName: \"kubernetes.io/projected/bbb5c437-a583-437d-9d16-34a0f8ed492e-kube-api-access-42fkm\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.998083 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/bbb5c437-a583-437d-9d16-34a0f8ed492e-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:58 crc kubenswrapper[4841]: I0130 06:52:58.998356 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.101579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.101694 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.101721 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbb5c437-a583-437d-9d16-34a0f8ed492e-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.101740 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.101765 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bbb5c437-a583-437d-9d16-34a0f8ed492e-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.101801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42fkm\" (UniqueName: \"kubernetes.io/projected/bbb5c437-a583-437d-9d16-34a0f8ed492e-kube-api-access-42fkm\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.101833 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/bbb5c437-a583-437d-9d16-34a0f8ed492e-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.103499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/bbb5c437-a583-437d-9d16-34a0f8ed492e-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.103772 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.107464 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bbb5c437-a583-437d-9d16-34a0f8ed492e-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.107478 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.107591 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.107785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bbb5c437-a583-437d-9d16-34a0f8ed492e-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.111857 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8ab7aae2-b5ed-4399-8016-5ec0e9101070" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.112104 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/bbb5c437-a583-437d-9d16-34a0f8ed492e-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.126200 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42fkm\" (UniqueName: \"kubernetes.io/projected/bbb5c437-a583-437d-9d16-34a0f8ed492e-kube-api-access-42fkm\") pod \"alertmanager-metric-storage-0\" (UID: \"bbb5c437-a583-437d-9d16-34a0f8ed492e\") " pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.128950 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.183066 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8ab7aae2-b5ed-4399-8016-5ec0e9101070" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.245768 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.345487 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.483668 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.486186 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490171 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-54crc" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490235 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490335 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490390 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490543 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490660 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490756 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.490818 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.500676 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615512 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615591 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615646 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7d1f3e8-04da-441b-8338-5b4d221487a3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615693 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-872j6\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-kube-api-access-872j6\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615758 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615825 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.615844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc 
kubenswrapper[4841]: I0130 06:52:59.717594 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717685 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717726 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " 
pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717845 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7d1f3e8-04da-441b-8338-5b4d221487a3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.717939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-872j6\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-kube-api-access-872j6\") pod \"prometheus-metric-storage-0\" (UID: 
\"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.718496 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.718892 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.719301 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.722546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7d1f3e8-04da-441b-8338-5b4d221487a3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.722578 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.724265 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.724298 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.730925 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.732054 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.732083 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d997d63680c35cfa1d0fbaee39f8f1084d441010eb2256442bcfe3f15841c762/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.733741 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-872j6\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-kube-api-access-872j6\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.771133 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.809043 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 30 06:52:59 crc kubenswrapper[4841]: I0130 06:52:59.811900 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.115471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce","Type":"ContainerStarted","Data":"a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933"} Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.115789 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.115800 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce","Type":"ContainerStarted","Data":"074a63acf427a8aa24b831e6f39c392a2dd28f24ff402e9fffa2b804ea6736be"} Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.120234 4841 generic.go:334] "Generic (PLEG): container finished" podID="4daf8ad7-5722-443b-9ac6-f9742ba7db0a" containerID="cf7fa48e8b80bd88ae241deae17a448b0c4ee4e2d53672773ea5fc32fecfcec4" exitCode=137 Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.123994 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bbb5c437-a583-437d-9d16-34a0f8ed492e","Type":"ContainerStarted","Data":"dd9875ed42251049b9e86082f55af93f2ac24dd21a0ca2de18682a9b74c94313"} Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.125821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5f8c4ab1-6430-4b53-8021-0bfeba020584","Type":"ContainerStarted","Data":"09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138"} Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.125842 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"5f8c4ab1-6430-4b53-8021-0bfeba020584","Type":"ContainerStarted","Data":"08dc91985b0933d6fafe91f56b15535a95885ae34e01b9aeaf8e53a0ca6effa6"} Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.148961 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.6786331159999999 podStartE2EDuration="2.148941281s" podCreationTimestamp="2026-01-30 06:52:58 +0000 UTC" firstStartedPulling="2026-01-30 06:52:59.137055871 +0000 UTC m=+6316.130528509" lastFinishedPulling="2026-01-30 06:52:59.607364036 +0000 UTC m=+6316.600836674" observedRunningTime="2026-01-30 06:53:00.12903091 +0000 UTC m=+6317.122503548" watchObservedRunningTime="2026-01-30 06:53:00.148941281 +0000 UTC m=+6317.142413919" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.185968 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.185945878 podStartE2EDuration="3.185945878s" podCreationTimestamp="2026-01-30 06:52:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:53:00.145695744 +0000 UTC m=+6317.139168372" watchObservedRunningTime="2026-01-30 06:53:00.185945878 +0000 UTC m=+6317.179418516" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.257776 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.265383 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.327605 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-combined-ca-bundle\") pod \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.327647 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config-secret\") pod \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.328392 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsrp9\" (UniqueName: \"kubernetes.io/projected/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-kube-api-access-zsrp9\") pod \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.328501 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config\") pod \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\" (UID: \"4daf8ad7-5722-443b-9ac6-f9742ba7db0a\") " Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.332900 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-kube-api-access-zsrp9" (OuterVolumeSpecName: "kube-api-access-zsrp9") pod "4daf8ad7-5722-443b-9ac6-f9742ba7db0a" (UID: "4daf8ad7-5722-443b-9ac6-f9742ba7db0a"). InnerVolumeSpecName "kube-api-access-zsrp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.367573 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4daf8ad7-5722-443b-9ac6-f9742ba7db0a" (UID: "4daf8ad7-5722-443b-9ac6-f9742ba7db0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.379315 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4daf8ad7-5722-443b-9ac6-f9742ba7db0a" (UID: "4daf8ad7-5722-443b-9ac6-f9742ba7db0a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.382819 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4daf8ad7-5722-443b-9ac6-f9742ba7db0a" (UID: "4daf8ad7-5722-443b-9ac6-f9742ba7db0a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.430543 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.430571 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.430583 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsrp9\" (UniqueName: \"kubernetes.io/projected/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-kube-api-access-zsrp9\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.430592 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4daf8ad7-5722-443b-9ac6-f9742ba7db0a-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:00 crc kubenswrapper[4841]: I0130 06:53:00.441808 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4daf8ad7-5722-443b-9ac6-f9742ba7db0a" path="/var/lib/kubelet/pods/4daf8ad7-5722-443b-9ac6-f9742ba7db0a/volumes" Jan 30 06:53:01 crc kubenswrapper[4841]: I0130 06:53:01.158270 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerStarted","Data":"7f50e39f8cdb467f4ad71dacc6e652831fb624175eb4f6c49572c4701a166d12"} Jan 30 06:53:01 crc kubenswrapper[4841]: I0130 06:53:01.160625 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 06:53:01 crc kubenswrapper[4841]: I0130 06:53:01.161024 4841 scope.go:117] "RemoveContainer" containerID="cf7fa48e8b80bd88ae241deae17a448b0c4ee4e2d53672773ea5fc32fecfcec4" Jan 30 06:53:06 crc kubenswrapper[4841]: I0130 06:53:06.234108 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerStarted","Data":"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"} Jan 30 06:53:06 crc kubenswrapper[4841]: I0130 06:53:06.236876 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bbb5c437-a583-437d-9d16-34a0f8ed492e","Type":"ContainerStarted","Data":"1bf24c961f7938db2443e7b2293b390f00a791b7d9517f43cd82ee58108200ba"} Jan 30 06:53:08 crc kubenswrapper[4841]: I0130 06:53:08.463524 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 06:53:14 crc kubenswrapper[4841]: I0130 06:53:14.350239 4841 generic.go:334] "Generic (PLEG): container finished" podID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerID="ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566" exitCode=0 Jan 30 06:53:14 crc kubenswrapper[4841]: I0130 06:53:14.350352 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerDied","Data":"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"} Jan 30 06:53:16 crc kubenswrapper[4841]: I0130 06:53:16.379729 4841 generic.go:334] "Generic (PLEG): container finished" podID="bbb5c437-a583-437d-9d16-34a0f8ed492e" containerID="1bf24c961f7938db2443e7b2293b390f00a791b7d9517f43cd82ee58108200ba" exitCode=0 Jan 30 06:53:16 crc kubenswrapper[4841]: I0130 06:53:16.379823 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"bbb5c437-a583-437d-9d16-34a0f8ed492e","Type":"ContainerDied","Data":"1bf24c961f7938db2443e7b2293b390f00a791b7d9517f43cd82ee58108200ba"} Jan 30 06:53:20 crc kubenswrapper[4841]: I0130 06:53:20.049621 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-458jh"] Jan 30 06:53:20 crc kubenswrapper[4841]: I0130 06:53:20.062633 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e656-account-create-update-hbqxm"] Jan 30 06:53:20 crc kubenswrapper[4841]: I0130 06:53:20.079517 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-458jh"] Jan 30 06:53:20 crc kubenswrapper[4841]: I0130 06:53:20.089692 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e656-account-create-update-hbqxm"] Jan 30 06:53:20 crc kubenswrapper[4841]: I0130 06:53:20.443951 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db6f32a-2832-4c6d-a811-172f210560d4" path="/var/lib/kubelet/pods/6db6f32a-2832-4c6d-a811-172f210560d4/volumes" Jan 30 06:53:20 crc kubenswrapper[4841]: I0130 06:53:20.446139 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784a3ed6-714b-4646-9070-4b660e0ee354" path="/var/lib/kubelet/pods/784a3ed6-714b-4646-9070-4b660e0ee354/volumes" Jan 30 06:53:23 crc kubenswrapper[4841]: I0130 06:53:23.474896 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bbb5c437-a583-437d-9d16-34a0f8ed492e","Type":"ContainerStarted","Data":"7cc623ba0e37d2b060389fdaf8634366f0f212665f398dd03e30216a071959c1"} Jan 30 06:53:23 crc kubenswrapper[4841]: I0130 06:53:23.477681 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerStarted","Data":"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"} Jan 30 06:53:26 crc kubenswrapper[4841]: I0130 06:53:26.053685 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-84ztj"] Jan 30 06:53:26 crc kubenswrapper[4841]: I0130 06:53:26.071999 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-84ztj"] Jan 30 06:53:26 crc kubenswrapper[4841]: I0130 06:53:26.455964 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34d65dc-52e3-4321-8d51-a5657da47566" path="/var/lib/kubelet/pods/f34d65dc-52e3-4321-8d51-a5657da47566/volumes" Jan 30 06:53:27 crc kubenswrapper[4841]: I0130 06:53:27.556249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bbb5c437-a583-437d-9d16-34a0f8ed492e","Type":"ContainerStarted","Data":"bac0d8b135b49c1fb43a7f1b038e43b147ef4baeb60b4d53b757723ed119e55f"} Jan 30 06:53:27 crc kubenswrapper[4841]: I0130 06:53:27.556625 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Jan 30 06:53:27 crc kubenswrapper[4841]: I0130 06:53:27.561882 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Jan 30 06:53:27 crc kubenswrapper[4841]: I0130 06:53:27.597377 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.9214870699999995 podStartE2EDuration="29.597353495s" podCreationTimestamp="2026-01-30 06:52:58 +0000 UTC" firstStartedPulling="2026-01-30 06:52:59.824384365 +0000 UTC m=+6316.817857003" lastFinishedPulling="2026-01-30 06:53:22.50025078 +0000 UTC m=+6339.493723428" observedRunningTime="2026-01-30 06:53:27.586521266 +0000 UTC m=+6344.579993934" watchObservedRunningTime="2026-01-30 
06:53:27.597353495 +0000 UTC m=+6344.590826173" Jan 30 06:53:28 crc kubenswrapper[4841]: I0130 06:53:28.573010 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerStarted","Data":"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"} Jan 30 06:53:31 crc kubenswrapper[4841]: I0130 06:53:31.609549 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerStarted","Data":"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"} Jan 30 06:53:31 crc kubenswrapper[4841]: I0130 06:53:31.671086 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=2.813042364 podStartE2EDuration="33.671060133s" podCreationTimestamp="2026-01-30 06:52:58 +0000 UTC" firstStartedPulling="2026-01-30 06:53:00.271983713 +0000 UTC m=+6317.265456351" lastFinishedPulling="2026-01-30 06:53:31.130001442 +0000 UTC m=+6348.123474120" observedRunningTime="2026-01-30 06:53:31.661515668 +0000 UTC m=+6348.654988336" watchObservedRunningTime="2026-01-30 06:53:31.671060133 +0000 UTC m=+6348.664532801" Jan 30 06:53:34 crc kubenswrapper[4841]: I0130 06:53:34.813516 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.463851 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.464448 4841 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.763224 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.766491 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.767977 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.768500 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.773150 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.876607 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-log-httpd\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.876795 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nvg\" (UniqueName: \"kubernetes.io/projected/686c7916-64bf-4289-b82f-1890c7475246-kube-api-access-n7nvg\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.876977 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.877096 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-scripts\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.877174 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.877306 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-run-httpd\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.877378 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-config-data\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.979328 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.979414 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-scripts\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.979453 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.979481 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-run-httpd\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.979505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-config-data\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.979552 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-log-httpd\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.979590 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nvg\" (UniqueName: \"kubernetes.io/projected/686c7916-64bf-4289-b82f-1890c7475246-kube-api-access-n7nvg\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.980427 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-log-httpd\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.980450 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-run-httpd\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.985249 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-scripts\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.985566 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-config-data\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.985961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.993370 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:40 crc kubenswrapper[4841]: I0130 06:53:40.998162 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nvg\" (UniqueName: \"kubernetes.io/projected/686c7916-64bf-4289-b82f-1890c7475246-kube-api-access-n7nvg\") pod \"ceilometer-0\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") " pod="openstack/ceilometer-0"
Jan 30 06:53:41 crc kubenswrapper[4841]: I0130 06:53:41.084424 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 06:53:41 crc kubenswrapper[4841]: I0130 06:53:41.584371 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:53:41 crc kubenswrapper[4841]: I0130 06:53:41.594084 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 06:53:41 crc kubenswrapper[4841]: I0130 06:53:41.732153 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerStarted","Data":"75ca51b4142be2124983a1c3343ab1374703539231e6c9a6ef1954cab82c2a00"}
Jan 30 06:53:42 crc kubenswrapper[4841]: I0130 06:53:42.742857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerStarted","Data":"248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda"}
Jan 30 06:53:43 crc kubenswrapper[4841]: I0130 06:53:43.761736 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerStarted","Data":"e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77"}
Jan 30 06:53:44 crc kubenswrapper[4841]: I0130 06:53:44.785660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerStarted","Data":"e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4"}
Jan 30 06:53:44 crc kubenswrapper[4841]: I0130 06:53:44.813220 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:53:44 crc kubenswrapper[4841]: I0130 06:53:44.816456 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:53:45 crc kubenswrapper[4841]: I0130 06:53:45.802234 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:53:46 crc kubenswrapper[4841]: I0130 06:53:46.817514 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerStarted","Data":"42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa"}
Jan 30 06:53:46 crc kubenswrapper[4841]: I0130 06:53:46.817815 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 06:53:46 crc kubenswrapper[4841]: I0130 06:53:46.861949 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.673367409 podStartE2EDuration="6.86192669s" podCreationTimestamp="2026-01-30 06:53:40 +0000 UTC" firstStartedPulling="2026-01-30 06:53:41.593870375 +0000 UTC m=+6358.587343013" lastFinishedPulling="2026-01-30 06:53:45.782429636 +0000 UTC m=+6362.775902294" observedRunningTime="2026-01-30 06:53:46.852082887 +0000 UTC m=+6363.845555535" watchObservedRunningTime="2026-01-30 06:53:46.86192669 +0000 UTC m=+6363.855399338"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.173455 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.173664 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" containerName="openstackclient" containerID="cri-o://09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138" gracePeriod=2
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.185097 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.282964 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Jan 30 06:53:47 crc kubenswrapper[4841]: E0130 06:53:47.283317 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" containerName="openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.283336 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" containerName="openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.293707 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" containerName="openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.294293 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.294367 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.310999 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" podUID="207e1540-ff6c-44b7-8d66-a6a4572fcbb2"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.344716 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-openstack-config\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.344778 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.344950 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.345078 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st5rq\" (UniqueName: \"kubernetes.io/projected/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-kube-api-access-st5rq\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.447351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-openstack-config\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.447434 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.447575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.447646 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st5rq\" (UniqueName: \"kubernetes.io/projected/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-kube-api-access-st5rq\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.448250 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-openstack-config\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.454758 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.455045 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.470816 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st5rq\" (UniqueName: \"kubernetes.io/projected/207e1540-ff6c-44b7-8d66-a6a4572fcbb2-kube-api-access-st5rq\") pod \"openstackclient\" (UID: \"207e1540-ff6c-44b7-8d66-a6a4572fcbb2\") " pod="openstack/openstackclient"
Jan 30 06:53:47 crc kubenswrapper[4841]: I0130 06:53:47.624661 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.171379 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 30 06:53:48 crc kubenswrapper[4841]: W0130 06:53:48.188236 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod207e1540_ff6c_44b7_8d66_a6a4572fcbb2.slice/crio-57bbde16cfba60fab495803f1d8997b76e0ba8fa404a0664513e89ddd76fe7fa WatchSource:0}: Error finding container 57bbde16cfba60fab495803f1d8997b76e0ba8fa404a0664513e89ddd76fe7fa: Status 404 returned error can't find the container with id 57bbde16cfba60fab495803f1d8997b76e0ba8fa404a0664513e89ddd76fe7fa
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.717995 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.718635 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="thanos-sidecar" containerID="cri-o://1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4" gracePeriod=600
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.718650 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="config-reloader" containerID="cri-o://e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a" gracePeriod=600
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.718547 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="prometheus" containerID="cri-o://c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0" gracePeriod=600
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.853972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"207e1540-ff6c-44b7-8d66-a6a4572fcbb2","Type":"ContainerStarted","Data":"513269c3e64d11267a411958f59479147c7b66c94ce22b9fe6b7c5855d197fc1"}
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.854216 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"207e1540-ff6c-44b7-8d66-a6a4572fcbb2","Type":"ContainerStarted","Data":"57bbde16cfba60fab495803f1d8997b76e0ba8fa404a0664513e89ddd76fe7fa"}
Jan 30 06:53:48 crc kubenswrapper[4841]: I0130 06:53:48.879896 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.879881846 podStartE2EDuration="1.879881846s" podCreationTimestamp="2026-01-30 06:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:53:48.877842761 +0000 UTC m=+6365.871315409" watchObservedRunningTime="2026-01-30 06:53:48.879881846 +0000 UTC m=+6365.873354484"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.420583 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.590738 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config-secret\") pod \"5f8c4ab1-6430-4b53-8021-0bfeba020584\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") "
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.590806 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config\") pod \"5f8c4ab1-6430-4b53-8021-0bfeba020584\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") "
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.590845 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-combined-ca-bundle\") pod \"5f8c4ab1-6430-4b53-8021-0bfeba020584\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") "
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.590929 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5mp\" (UniqueName: \"kubernetes.io/projected/5f8c4ab1-6430-4b53-8021-0bfeba020584-kube-api-access-nq5mp\") pod \"5f8c4ab1-6430-4b53-8021-0bfeba020584\" (UID: \"5f8c4ab1-6430-4b53-8021-0bfeba020584\") "
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.606816 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8c4ab1-6430-4b53-8021-0bfeba020584-kube-api-access-nq5mp" (OuterVolumeSpecName: "kube-api-access-nq5mp") pod "5f8c4ab1-6430-4b53-8021-0bfeba020584" (UID: "5f8c4ab1-6430-4b53-8021-0bfeba020584"). InnerVolumeSpecName "kube-api-access-nq5mp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.626309 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f8c4ab1-6430-4b53-8021-0bfeba020584" (UID: "5f8c4ab1-6430-4b53-8021-0bfeba020584"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.652321 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5f8c4ab1-6430-4b53-8021-0bfeba020584" (UID: "5f8c4ab1-6430-4b53-8021-0bfeba020584"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.657937 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5f8c4ab1-6430-4b53-8021-0bfeba020584" (UID: "5f8c4ab1-6430-4b53-8021-0bfeba020584"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.693922 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq5mp\" (UniqueName: \"kubernetes.io/projected/5f8c4ab1-6430-4b53-8021-0bfeba020584-kube-api-access-nq5mp\") on node \"crc\" DevicePath \"\""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.693955 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.693969 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f8c4ab1-6430-4b53-8021-0bfeba020584-openstack-config\") on node \"crc\" DevicePath \"\""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.693982 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8c4ab1-6430-4b53-8021-0bfeba020584-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.822202 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.866888 4841 generic.go:334] "Generic (PLEG): container finished" podID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerID="1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4" exitCode=0
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.866930 4841 generic.go:334] "Generic (PLEG): container finished" podID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerID="e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a" exitCode=0
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.866947 4841 generic.go:334] "Generic (PLEG): container finished" podID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerID="c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0" exitCode=0
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.867011 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerDied","Data":"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"}
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.867234 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerDied","Data":"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"}
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.867248 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerDied","Data":"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"}
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.867261 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d7d1f3e8-04da-441b-8338-5b4d221487a3","Type":"ContainerDied","Data":"7f50e39f8cdb467f4ad71dacc6e652831fb624175eb4f6c49572c4701a166d12"}
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.867282 4841 scope.go:117] "RemoveContainer" containerID="1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.867453 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.871159 4841 generic.go:334] "Generic (PLEG): container finished" podID="5f8c4ab1-6430-4b53-8021-0bfeba020584" containerID="09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138" exitCode=137
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.871204 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.891310 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" podUID="207e1540-ff6c-44b7-8d66-a6a4572fcbb2"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.896558 4841 scope.go:117] "RemoveContainer" containerID="e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.942662 4841 scope.go:117] "RemoveContainer" containerID="c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.963130 4841 scope.go:117] "RemoveContainer" containerID="ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.980515 4841 scope.go:117] "RemoveContainer" containerID="1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"
Jan 30 06:53:49 crc kubenswrapper[4841]: E0130 06:53:49.981034 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4\": container with ID starting with 1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4 not found: ID does not exist" containerID="1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.981084 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"} err="failed to get container status \"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4\": rpc error: code = NotFound desc = could not find container \"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4\": container with ID starting with 1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.981117 4841 scope.go:117] "RemoveContainer" containerID="e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"
Jan 30 06:53:49 crc kubenswrapper[4841]: E0130 06:53:49.981480 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a\": container with ID starting with e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a not found: ID does not exist" containerID="e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.981551 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"} err="failed to get container status \"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a\": rpc error: code = NotFound desc = could not find container \"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a\": container with ID starting with e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.981603 4841 scope.go:117] "RemoveContainer" containerID="c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"
Jan 30 06:53:49 crc kubenswrapper[4841]: E0130 06:53:49.981940 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0\": container with ID starting with c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0 not found: ID does not exist" containerID="c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.981968 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"} err="failed to get container status \"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0\": rpc error: code = NotFound desc = could not find container \"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0\": container with ID starting with c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.981986 4841 scope.go:117] "RemoveContainer" containerID="ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"
Jan 30 06:53:49 crc kubenswrapper[4841]: E0130 06:53:49.982296 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566\": container with ID starting with ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566 not found: ID does not exist" containerID="ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.982338 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"} err="failed to get container status \"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566\": rpc error: code = NotFound desc = could not find container \"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566\": container with ID starting with ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.982366 4841 scope.go:117] "RemoveContainer" containerID="1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.982730 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"} err="failed to get container status \"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4\": rpc error: code = NotFound desc = could not find container \"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4\": container with ID starting with 1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.982775 4841 scope.go:117] "RemoveContainer" containerID="e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.983073 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"} err="failed to get container status \"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a\": rpc error: code = NotFound desc = could not find container \"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a\": container with ID starting with e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.983092 4841 scope.go:117] "RemoveContainer" containerID="c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.983965 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"} err="failed to get container status \"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0\": rpc error: code = NotFound desc = could not find container \"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0\": container with ID starting with c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.983989 4841 scope.go:117] "RemoveContainer" containerID="ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.984296 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"} err="failed to get container status \"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566\": rpc error: code = NotFound desc = could not find container \"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566\": container with ID starting with ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.984313 4841 scope.go:117] "RemoveContainer" containerID="1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.984670 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4"} err="failed to get container status \"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4\": rpc error: code = NotFound desc = could not find container \"1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4\": container with ID starting with 1ac0e29e1f73a3dd2577057ed77a4fa68a7ddb65d43eaef57a3a677dcaccf3a4 not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.984690 4841 scope.go:117] "RemoveContainer" containerID="e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.986820 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a"} err="failed to get container status \"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a\": rpc error: code = NotFound desc = could not find container \"e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a\": container with ID starting with e30b1fcbc6e049208d30f4486eba3077646941ba15ae8f66b3ccfdffa72d841a not found: ID does not exist"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.986842 4841 scope.go:117] "RemoveContainer" containerID="c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"
Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.987205 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0"} err="failed to get container status \"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0\": rpc error: code = NotFound desc = could
not find container \"c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0\": container with ID starting with c28678d3ea6e5f6c2c44ff603c14b9f1bb97d7598f9c2c95c23c56d1994238e0 not found: ID does not exist" Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.987247 4841 scope.go:117] "RemoveContainer" containerID="ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566" Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.987578 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566"} err="failed to get container status \"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566\": rpc error: code = NotFound desc = could not find container \"ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566\": container with ID starting with ada76a5ace4557a46ac69a4bf0c338fc01c83aaf1c8ac75877b955b3bfc55566 not found: ID does not exist" Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.987599 4841 scope.go:117] "RemoveContainer" containerID="09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138" Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999096 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-config\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999340 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-0\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999429 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-thanos-prometheus-http-client-file\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999469 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-tls-assets\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999510 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-1\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999536 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-872j6\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-kube-api-access-872j6\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999573 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-2\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999644 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/d7d1f3e8-04da-441b-8338-5b4d221487a3-config-out\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:49 crc kubenswrapper[4841]: I0130 06:53:49.999683 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-web-config\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:49.999803 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"d7d1f3e8-04da-441b-8338-5b4d221487a3\" (UID: \"d7d1f3e8-04da-441b-8338-5b4d221487a3\") " Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:49.999909 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.000220 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.000536 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.000702 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.000717 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.006910 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-kube-api-access-872j6" (OuterVolumeSpecName: "kube-api-access-872j6") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "kube-api-access-872j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.006915 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-config" (OuterVolumeSpecName: "config") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.007001 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.008072 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.008158 4841 scope.go:117] "RemoveContainer" containerID="09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138" Jan 30 06:53:50 crc kubenswrapper[4841]: E0130 06:53:50.008572 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138\": container with ID starting with 09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138 not found: ID does not exist" containerID="09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.008619 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138"} err="failed to get container status \"09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138\": rpc 
error: code = NotFound desc = could not find container \"09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138\": container with ID starting with 09322f27038208e17d1fa8f8585d994181f56f9797a3b4065facf8b25e406138 not found: ID does not exist" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.008574 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d1f3e8-04da-441b-8338-5b4d221487a3-config-out" (OuterVolumeSpecName: "config-out") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.029471 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-web-config" (OuterVolumeSpecName: "web-config") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.030742 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "d7d1f3e8-04da-441b-8338-5b4d221487a3" (UID: "d7d1f3e8-04da-441b-8338-5b4d221487a3"). InnerVolumeSpecName "pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103042 4841 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103079 4841 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103090 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7d1f3e8-04da-441b-8338-5b4d221487a3-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103100 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-872j6\" (UniqueName: \"kubernetes.io/projected/d7d1f3e8-04da-441b-8338-5b4d221487a3-kube-api-access-872j6\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103114 4841 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7d1f3e8-04da-441b-8338-5b4d221487a3-config-out\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103124 4841 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-web-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103161 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") on node \"crc\" " Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.103173 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d1f3e8-04da-441b-8338-5b4d221487a3-config\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.145180 4841 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.145569 4841 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc") on node "crc" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.206846 4841 reconciler_common.go:293] "Volume detached for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.230531 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.239510 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.274788 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:53:50 crc kubenswrapper[4841]: E0130 06:53:50.275311 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="init-config-reloader" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.275336 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="init-config-reloader" Jan 30 06:53:50 crc kubenswrapper[4841]: E0130 06:53:50.275363 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="config-reloader" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.275370 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="config-reloader" Jan 30 06:53:50 crc kubenswrapper[4841]: E0130 06:53:50.275382 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="prometheus" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.275388 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="prometheus" Jan 30 06:53:50 crc kubenswrapper[4841]: E0130 06:53:50.275428 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="thanos-sidecar" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.275437 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="thanos-sidecar" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.275686 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="config-reloader" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.275726 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="prometheus" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.275739 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="thanos-sidecar" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.277948 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.282781 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-96zrp"] Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.282803 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.282849 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.282858 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.282879 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.282803 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.284053 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.284328 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.284686 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.284858 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-54crc" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.291014 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.295738 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.303498 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-96zrp"] Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.389806 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-1557-account-create-update-jbrq7"] Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.391167 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.393383 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.401006 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1557-account-create-update-jbrq7"] Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.411127 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.411163 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.411185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.411234 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb572435-b17a-4caf-afb1-a2c334f6bc35-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.412341 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.412814 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqlb\" (UniqueName: \"kubernetes.io/projected/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-kube-api-access-2dqlb\") pod \"aodh-db-create-96zrp\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.412864 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.412903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb572435-b17a-4caf-afb1-a2c334f6bc35-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.412922 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.412937 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.412990 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxsms\" (UniqueName: \"kubernetes.io/projected/fb572435-b17a-4caf-afb1-a2c334f6bc35-kube-api-access-fxsms\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.413016 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.413124 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-operator-scripts\") pod \"aodh-db-create-96zrp\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " pod="openstack/aodh-db-create-96zrp" Jan 30 
06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.413142 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.413160 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.442681 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8c4ab1-6430-4b53-8021-0bfeba020584" path="/var/lib/kubelet/pods/5f8c4ab1-6430-4b53-8021-0bfeba020584/volumes" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.443982 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" path="/var/lib/kubelet/pods/d7d1f3e8-04da-441b-8338-5b4d221487a3/volumes" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.514964 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515024 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb572435-b17a-4caf-afb1-a2c334f6bc35-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515044 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515062 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-operator-scripts\") pod \"aodh-1557-account-create-update-jbrq7\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515126 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxsms\" (UniqueName: \"kubernetes.io/projected/fb572435-b17a-4caf-afb1-a2c334f6bc35-kube-api-access-fxsms\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515148 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515179 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xct\" (UniqueName: \"kubernetes.io/projected/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-kube-api-access-p6xct\") pod \"aodh-1557-account-create-update-jbrq7\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-operator-scripts\") pod \"aodh-db-create-96zrp\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515265 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " 
pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515333 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515388 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb572435-b17a-4caf-afb1-a2c334f6bc35-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515440 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.515459 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqlb\" (UniqueName: \"kubernetes.io/projected/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-kube-api-access-2dqlb\") pod \"aodh-db-create-96zrp\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.516625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-operator-scripts\") pod \"aodh-db-create-96zrp\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.517879 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.518177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.518595 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb572435-b17a-4caf-afb1-a2c334f6bc35-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.520096 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.522056 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.522096 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb572435-b17a-4caf-afb1-a2c334f6bc35-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.522871 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.523098 4841 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.523130 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d997d63680c35cfa1d0fbaee39f8f1084d441010eb2256442bcfe3f15841c762/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.529611 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb572435-b17a-4caf-afb1-a2c334f6bc35-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.532380 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqlb\" (UniqueName: \"kubernetes.io/projected/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-kube-api-access-2dqlb\") pod \"aodh-db-create-96zrp\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.532799 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.535973 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.541018 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb572435-b17a-4caf-afb1-a2c334f6bc35-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.550145 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxsms\" (UniqueName: \"kubernetes.io/projected/fb572435-b17a-4caf-afb1-a2c334f6bc35-kube-api-access-fxsms\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.567167 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a23dc1d5-35b7-4fb8-8469-d2a35f163efc\") pod \"prometheus-metric-storage-0\" (UID: \"fb572435-b17a-4caf-afb1-a2c334f6bc35\") " pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.617803 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-operator-scripts\") pod \"aodh-1557-account-create-update-jbrq7\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.617905 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xct\" (UniqueName: \"kubernetes.io/projected/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-kube-api-access-p6xct\") pod \"aodh-1557-account-create-update-jbrq7\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.618984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-operator-scripts\") pod \"aodh-1557-account-create-update-jbrq7\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.628798 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.636684 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xct\" (UniqueName: \"kubernetes.io/projected/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-kube-api-access-p6xct\") pod \"aodh-1557-account-create-update-jbrq7\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.650224 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:50 crc kubenswrapper[4841]: I0130 06:53:50.718979 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.126981 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.221537 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-96zrp"] Jan 30 06:53:51 crc kubenswrapper[4841]: W0130 06:53:51.222987 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dbd7f71_0dfa_40ca_b8a4_1455a31e3f44.slice/crio-ccdc20a83a72bd3f4fc423cf2fc62ad1af0b1f0e4cfd1b4f87d4338b06fa8425 WatchSource:0}: Error finding container ccdc20a83a72bd3f4fc423cf2fc62ad1af0b1f0e4cfd1b4f87d4338b06fa8425: Status 404 returned error can't find the container with id ccdc20a83a72bd3f4fc423cf2fc62ad1af0b1f0e4cfd1b4f87d4338b06fa8425 Jan 30 06:53:51 crc kubenswrapper[4841]: W0130 06:53:51.228166 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd248d34a_8ccf_48dd_bb30_9ad79bd380c8.slice/crio-0c8d5115f401238eca4fe4ec21e1ce4d6a6e65e1b8de6e2da56db970baa1868a WatchSource:0}: Error finding container 0c8d5115f401238eca4fe4ec21e1ce4d6a6e65e1b8de6e2da56db970baa1868a: Status 404 returned error can't find the container with id 0c8d5115f401238eca4fe4ec21e1ce4d6a6e65e1b8de6e2da56db970baa1868a Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.229430 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1557-account-create-update-jbrq7"] Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.893448 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb572435-b17a-4caf-afb1-a2c334f6bc35","Type":"ContainerStarted","Data":"fca7947b6a8c8ef38f6a4f1e40bccf03794aaecd4899458a75fdb263311db582"} Jan 30 06:53:51 crc 
kubenswrapper[4841]: I0130 06:53:51.896688 4841 generic.go:334] "Generic (PLEG): container finished" podID="4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" containerID="424f52dbd0bc876630cdb01ccdafa8905c99c6c3802758ba1afc93b9ed6e00cc" exitCode=0 Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.896748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-96zrp" event={"ID":"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44","Type":"ContainerDied","Data":"424f52dbd0bc876630cdb01ccdafa8905c99c6c3802758ba1afc93b9ed6e00cc"} Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.896792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-96zrp" event={"ID":"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44","Type":"ContainerStarted","Data":"ccdc20a83a72bd3f4fc423cf2fc62ad1af0b1f0e4cfd1b4f87d4338b06fa8425"} Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.899806 4841 generic.go:334] "Generic (PLEG): container finished" podID="d248d34a-8ccf-48dd-bb30-9ad79bd380c8" containerID="31f586e4fcccb4d5bc63337210744e4a8360bff4d48c0ae8692117a061c2023f" exitCode=0 Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.899831 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1557-account-create-update-jbrq7" event={"ID":"d248d34a-8ccf-48dd-bb30-9ad79bd380c8","Type":"ContainerDied","Data":"31f586e4fcccb4d5bc63337210744e4a8360bff4d48c0ae8692117a061c2023f"} Jan 30 06:53:51 crc kubenswrapper[4841]: I0130 06:53:51.899845 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1557-account-create-update-jbrq7" event={"ID":"d248d34a-8ccf-48dd-bb30-9ad79bd380c8","Type":"ContainerStarted","Data":"0c8d5115f401238eca4fe4ec21e1ce4d6a6e65e1b8de6e2da56db970baa1868a"} Jan 30 06:53:52 crc kubenswrapper[4841]: I0130 06:53:52.814160 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d7d1f3e8-04da-441b-8338-5b4d221487a3" containerName="prometheus" 
probeResult="failure" output="Get \"http://10.217.1.147:9090/-/ready\": dial tcp 10.217.1.147:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.686001 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.691921 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.782228 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-operator-scripts\") pod \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.782482 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-operator-scripts\") pod \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.782567 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqlb\" (UniqueName: \"kubernetes.io/projected/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-kube-api-access-2dqlb\") pod \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\" (UID: \"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44\") " Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.782722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xct\" (UniqueName: \"kubernetes.io/projected/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-kube-api-access-p6xct\") pod \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\" (UID: \"d248d34a-8ccf-48dd-bb30-9ad79bd380c8\") " Jan 
30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.783087 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d248d34a-8ccf-48dd-bb30-9ad79bd380c8" (UID: "d248d34a-8ccf-48dd-bb30-9ad79bd380c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.783101 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" (UID: "4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.785099 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.785203 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.864379 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-kube-api-access-p6xct" (OuterVolumeSpecName: "kube-api-access-p6xct") pod "d248d34a-8ccf-48dd-bb30-9ad79bd380c8" (UID: "d248d34a-8ccf-48dd-bb30-9ad79bd380c8"). InnerVolumeSpecName "kube-api-access-p6xct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.888302 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xct\" (UniqueName: \"kubernetes.io/projected/d248d34a-8ccf-48dd-bb30-9ad79bd380c8-kube-api-access-p6xct\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.926842 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-96zrp" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.926853 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-96zrp" event={"ID":"4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44","Type":"ContainerDied","Data":"ccdc20a83a72bd3f4fc423cf2fc62ad1af0b1f0e4cfd1b4f87d4338b06fa8425"} Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.927010 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccdc20a83a72bd3f4fc423cf2fc62ad1af0b1f0e4cfd1b4f87d4338b06fa8425" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.929590 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1557-account-create-update-jbrq7" event={"ID":"d248d34a-8ccf-48dd-bb30-9ad79bd380c8","Type":"ContainerDied","Data":"0c8d5115f401238eca4fe4ec21e1ce4d6a6e65e1b8de6e2da56db970baa1868a"} Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.929638 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c8d5115f401238eca4fe4ec21e1ce4d6a6e65e1b8de6e2da56db970baa1868a" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.929657 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.963131 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-kube-api-access-2dqlb" (OuterVolumeSpecName: "kube-api-access-2dqlb") pod "4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" (UID: "4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44"). InnerVolumeSpecName "kube-api-access-2dqlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:53:53 crc kubenswrapper[4841]: I0130 06:53:53.990572 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqlb\" (UniqueName: \"kubernetes.io/projected/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44-kube-api-access-2dqlb\") on node \"crc\" DevicePath \"\"" Jan 30 06:53:54 crc kubenswrapper[4841]: I0130 06:53:54.163257 4841 scope.go:117] "RemoveContainer" containerID="5feca6095bf49629173a7a7a17b80a89b2f2c3a7c6b0e0948420042eccf5d4fc" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.111431 4841 scope.go:117] "RemoveContainer" containerID="03fd3fa2289acdd2fc71a7b7c1a7396e9786a4b0b35bae38706e87599b64b962" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.153323 4841 scope.go:117] "RemoveContainer" containerID="60f9c1d8b3bcccb14da8a6e1fce165f56e460f764bc86eae5015fcbf64d02369" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.205070 4841 scope.go:117] "RemoveContainer" containerID="0ad4ef0bcb1ebca85479ccfcd113fdac37df6c082ca09a839d66babd6d174eca" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.666901 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-npgfr"] Jan 30 06:53:55 crc kubenswrapper[4841]: E0130 06:53:55.667754 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d248d34a-8ccf-48dd-bb30-9ad79bd380c8" containerName="mariadb-account-create-update" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.667787 4841 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d248d34a-8ccf-48dd-bb30-9ad79bd380c8" containerName="mariadb-account-create-update" Jan 30 06:53:55 crc kubenswrapper[4841]: E0130 06:53:55.667827 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" containerName="mariadb-database-create" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.667840 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" containerName="mariadb-database-create" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.668231 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d248d34a-8ccf-48dd-bb30-9ad79bd380c8" containerName="mariadb-account-create-update" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.668275 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" containerName="mariadb-database-create" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.669463 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.674272 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.674444 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.674571 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9swpr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.679697 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.683757 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-npgfr"] Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.834438 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smmc\" (UniqueName: \"kubernetes.io/projected/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-kube-api-access-5smmc\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.834762 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-combined-ca-bundle\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.834911 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-config-data\") pod \"aodh-db-sync-npgfr\" (UID: 
\"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.835046 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-scripts\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.937680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smmc\" (UniqueName: \"kubernetes.io/projected/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-kube-api-access-5smmc\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.937783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-combined-ca-bundle\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.937827 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-config-data\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.937871 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-scripts\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr" Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.943879 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-scripts\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr"
Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.945276 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-combined-ca-bundle\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr"
Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.947024 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-config-data\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr"
Jan 30 06:53:55 crc kubenswrapper[4841]: I0130 06:53:55.969257 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smmc\" (UniqueName: \"kubernetes.io/projected/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-kube-api-access-5smmc\") pod \"aodh-db-sync-npgfr\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") " pod="openstack/aodh-db-sync-npgfr"
Jan 30 06:53:56 crc kubenswrapper[4841]: I0130 06:53:56.006808 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-npgfr"
Jan 30 06:53:56 crc kubenswrapper[4841]: I0130 06:53:56.504724 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-npgfr"]
Jan 30 06:53:56 crc kubenswrapper[4841]: I0130 06:53:56.980040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-npgfr" event={"ID":"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c","Type":"ContainerStarted","Data":"aaee468981fbacd57f2217331cbf7a0ddcf6112b508fe5a8fe1b249e193cc5cf"}
Jan 30 06:53:56 crc kubenswrapper[4841]: I0130 06:53:56.982043 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb572435-b17a-4caf-afb1-a2c334f6bc35","Type":"ContainerStarted","Data":"6030135fa7a9b50de27f8a8fb524570805347d5adc11cf3c995c7e8814a87b85"}
Jan 30 06:54:02 crc kubenswrapper[4841]: I0130 06:54:02.039149 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-npgfr" event={"ID":"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c","Type":"ContainerStarted","Data":"7d3802b1eec8b13571d4cc3babe84e4f32c139308006e2f7141bbeb194e5e5ec"}
Jan 30 06:54:02 crc kubenswrapper[4841]: I0130 06:54:02.064524 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-npgfr" podStartSLOduration=2.705842312 podStartE2EDuration="7.0645006s" podCreationTimestamp="2026-01-30 06:53:55 +0000 UTC" firstStartedPulling="2026-01-30 06:53:56.51466705 +0000 UTC m=+6373.508139688" lastFinishedPulling="2026-01-30 06:54:00.873325318 +0000 UTC m=+6377.866797976" observedRunningTime="2026-01-30 06:54:02.063431502 +0000 UTC m=+6379.056904170" watchObservedRunningTime="2026-01-30 06:54:02.0645006 +0000 UTC m=+6379.057973268"
Jan 30 06:54:04 crc kubenswrapper[4841]: I0130 06:54:04.067112 4841 generic.go:334] "Generic (PLEG): container finished" podID="6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" containerID="7d3802b1eec8b13571d4cc3babe84e4f32c139308006e2f7141bbeb194e5e5ec" exitCode=0
Jan 30 06:54:04 crc kubenswrapper[4841]: I0130 06:54:04.067882 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-npgfr" event={"ID":"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c","Type":"ContainerDied","Data":"7d3802b1eec8b13571d4cc3babe84e4f32c139308006e2f7141bbeb194e5e5ec"}
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.083753 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb572435-b17a-4caf-afb1-a2c334f6bc35" containerID="6030135fa7a9b50de27f8a8fb524570805347d5adc11cf3c995c7e8814a87b85" exitCode=0
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.083871 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb572435-b17a-4caf-afb1-a2c334f6bc35","Type":"ContainerDied","Data":"6030135fa7a9b50de27f8a8fb524570805347d5adc11cf3c995c7e8814a87b85"}
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.530141 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-npgfr"
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.697861 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-config-data\") pod \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") "
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.697947 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-scripts\") pod \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") "
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.698019 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-combined-ca-bundle\") pod \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") "
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.698063 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5smmc\" (UniqueName: \"kubernetes.io/projected/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-kube-api-access-5smmc\") pod \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\" (UID: \"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c\") "
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.702851 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-scripts" (OuterVolumeSpecName: "scripts") pod "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" (UID: "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.703642 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-kube-api-access-5smmc" (OuterVolumeSpecName: "kube-api-access-5smmc") pod "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" (UID: "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c"). InnerVolumeSpecName "kube-api-access-5smmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.735105 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" (UID: "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.745840 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-config-data" (OuterVolumeSpecName: "config-data") pod "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" (UID: "6f73848a-d9bc-4486-b7f5-f9f3fca5e13c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.801224 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.801280 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.801297 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:05 crc kubenswrapper[4841]: I0130 06:54:05.801313 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5smmc\" (UniqueName: \"kubernetes.io/projected/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c-kube-api-access-5smmc\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:06 crc kubenswrapper[4841]: I0130 06:54:06.098632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-npgfr" event={"ID":"6f73848a-d9bc-4486-b7f5-f9f3fca5e13c","Type":"ContainerDied","Data":"aaee468981fbacd57f2217331cbf7a0ddcf6112b508fe5a8fe1b249e193cc5cf"}
Jan 30 06:54:06 crc kubenswrapper[4841]: I0130 06:54:06.099033 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaee468981fbacd57f2217331cbf7a0ddcf6112b508fe5a8fe1b249e193cc5cf"
Jan 30 06:54:06 crc kubenswrapper[4841]: I0130 06:54:06.098678 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-npgfr"
Jan 30 06:54:06 crc kubenswrapper[4841]: I0130 06:54:06.101495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb572435-b17a-4caf-afb1-a2c334f6bc35","Type":"ContainerStarted","Data":"0b3b598055a9fdb443aa0f566f2daa36c02f2d911cb055d8ed272babfa7f098c"}
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.172135 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb572435-b17a-4caf-afb1-a2c334f6bc35","Type":"ContainerStarted","Data":"824e9432095b57d6abe144acb1ad0e9b4cee6c534362faf082519058a660f2f5"}
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.172600 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb572435-b17a-4caf-afb1-a2c334f6bc35","Type":"ContainerStarted","Data":"c4c985966de8e64ba66f855788adc35c3138b39d371198efa5332727bd8162da"}
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.226756 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.226729591 podStartE2EDuration="20.226729591s" podCreationTimestamp="2026-01-30 06:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 06:54:10.209777129 +0000 UTC m=+6387.203249767" watchObservedRunningTime="2026-01-30 06:54:10.226729591 +0000 UTC m=+6387.220202259"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.470194 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.470448 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.480972 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Jan 30 06:54:10 crc kubenswrapper[4841]: E0130 06:54:10.481293 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" containerName="aodh-db-sync"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.481308 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" containerName="aodh-db-sync"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.481495 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" containerName="aodh-db-sync"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.483229 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.483317 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.485934 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.485956 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9swpr"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.486622 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.613259 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.613309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-scripts\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.613330 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-config-data\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.613537 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjfzm\" (UniqueName: \"kubernetes.io/projected/19fdd47a-240f-41ce-98a6-b331af293ca1-kube-api-access-zjfzm\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.629430 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.716062 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.716115 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-scripts\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.716139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-config-data\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.716164 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjfzm\" (UniqueName: \"kubernetes.io/projected/19fdd47a-240f-41ce-98a6-b331af293ca1-kube-api-access-zjfzm\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.722087 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.725790 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-scripts\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.730776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-config-data\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.746031 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjfzm\" (UniqueName: \"kubernetes.io/projected/19fdd47a-240f-41ce-98a6-b331af293ca1-kube-api-access-zjfzm\") pod \"aodh-0\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " pod="openstack/aodh-0"
Jan 30 06:54:10 crc kubenswrapper[4841]: I0130 06:54:10.811092 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Jan 30 06:54:11 crc kubenswrapper[4841]: I0130 06:54:11.144889 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 06:54:11 crc kubenswrapper[4841]: I0130 06:54:11.329936 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Jan 30 06:54:12 crc kubenswrapper[4841]: I0130 06:54:12.210545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerStarted","Data":"6b92d2dab1150328abb26122eb6c6a11e84b01e00f25928108cbed68c74ecdf9"}
Jan 30 06:54:12 crc kubenswrapper[4841]: I0130 06:54:12.211067 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerStarted","Data":"3998174449c5d4a2a64101603791d7e3fb95911d632898f0f830f3a3d5e82920"}
Jan 30 06:54:12 crc kubenswrapper[4841]: I0130 06:54:12.517544 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:54:12 crc kubenswrapper[4841]: I0130 06:54:12.517795 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-central-agent" containerID="cri-o://248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda" gracePeriod=30
Jan 30 06:54:12 crc kubenswrapper[4841]: I0130 06:54:12.518176 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="proxy-httpd" containerID="cri-o://42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa" gracePeriod=30
Jan 30 06:54:12 crc kubenswrapper[4841]: I0130 06:54:12.518224 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="sg-core" containerID="cri-o://e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4" gracePeriod=30
Jan 30 06:54:12 crc kubenswrapper[4841]: I0130 06:54:12.518262 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-notification-agent" containerID="cri-o://e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77" gracePeriod=30
Jan 30 06:54:13 crc kubenswrapper[4841]: E0130 06:54:13.104984 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod686c7916_64bf_4289_b82f_1890c7475246.slice/crio-248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod686c7916_64bf_4289_b82f_1890c7475246.slice/crio-conmon-248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda.scope\": RecentStats: unable to find data in memory cache]"
Jan 30 06:54:13 crc kubenswrapper[4841]: I0130 06:54:13.280071 4841 generic.go:334] "Generic (PLEG): container finished" podID="686c7916-64bf-4289-b82f-1890c7475246" containerID="42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa" exitCode=0
Jan 30 06:54:13 crc kubenswrapper[4841]: I0130 06:54:13.280102 4841 generic.go:334] "Generic (PLEG): container finished" podID="686c7916-64bf-4289-b82f-1890c7475246" containerID="e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4" exitCode=2
Jan 30 06:54:13 crc kubenswrapper[4841]: I0130 06:54:13.280110 4841 generic.go:334] "Generic (PLEG): container finished" podID="686c7916-64bf-4289-b82f-1890c7475246" containerID="248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda" exitCode=0
Jan 30 06:54:13 crc kubenswrapper[4841]: I0130 06:54:13.280131 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerDied","Data":"42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa"}
Jan 30 06:54:13 crc kubenswrapper[4841]: I0130 06:54:13.280160 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerDied","Data":"e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4"}
Jan 30 06:54:13 crc kubenswrapper[4841]: I0130 06:54:13.280170 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerDied","Data":"248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda"}
Jan 30 06:54:13 crc kubenswrapper[4841]: I0130 06:54:13.709759 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Jan 30 06:54:14 crc kubenswrapper[4841]: I0130 06:54:14.291761 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerStarted","Data":"09030ae9a7f7917e9a694fb352b6a013fe574302567aa4156e8dd69592df4c03"}
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.301320 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerStarted","Data":"7dce8f344b41918435532978e310812d59c6ea4a4c82c7dfee5932561ebc05bc"}
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.764072 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.828830 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-run-httpd\") pod \"686c7916-64bf-4289-b82f-1890c7475246\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") "
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829074 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-scripts\") pod \"686c7916-64bf-4289-b82f-1890c7475246\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") "
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829129 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-combined-ca-bundle\") pod \"686c7916-64bf-4289-b82f-1890c7475246\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") "
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829202 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "686c7916-64bf-4289-b82f-1890c7475246" (UID: "686c7916-64bf-4289-b82f-1890c7475246"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829243 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-sg-core-conf-yaml\") pod \"686c7916-64bf-4289-b82f-1890c7475246\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") "
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829324 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-config-data\") pod \"686c7916-64bf-4289-b82f-1890c7475246\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") "
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829342 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-log-httpd\") pod \"686c7916-64bf-4289-b82f-1890c7475246\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") "
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829420 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7nvg\" (UniqueName: \"kubernetes.io/projected/686c7916-64bf-4289-b82f-1890c7475246-kube-api-access-n7nvg\") pod \"686c7916-64bf-4289-b82f-1890c7475246\" (UID: \"686c7916-64bf-4289-b82f-1890c7475246\") "
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.829882 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.830203 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "686c7916-64bf-4289-b82f-1890c7475246" (UID: "686c7916-64bf-4289-b82f-1890c7475246"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.834512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686c7916-64bf-4289-b82f-1890c7475246-kube-api-access-n7nvg" (OuterVolumeSpecName: "kube-api-access-n7nvg") pod "686c7916-64bf-4289-b82f-1890c7475246" (UID: "686c7916-64bf-4289-b82f-1890c7475246"). InnerVolumeSpecName "kube-api-access-n7nvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.835567 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-scripts" (OuterVolumeSpecName: "scripts") pod "686c7916-64bf-4289-b82f-1890c7475246" (UID: "686c7916-64bf-4289-b82f-1890c7475246"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.878771 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "686c7916-64bf-4289-b82f-1890c7475246" (UID: "686c7916-64bf-4289-b82f-1890c7475246"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.923125 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "686c7916-64bf-4289-b82f-1890c7475246" (UID: "686c7916-64bf-4289-b82f-1890c7475246"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.937724 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.937766 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.937780 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.937791 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/686c7916-64bf-4289-b82f-1890c7475246-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.937805 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7nvg\" (UniqueName: \"kubernetes.io/projected/686c7916-64bf-4289-b82f-1890c7475246-kube-api-access-n7nvg\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:15 crc kubenswrapper[4841]: I0130 06:54:15.960499 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-config-data" (OuterVolumeSpecName: "config-data") pod "686c7916-64bf-4289-b82f-1890c7475246" (UID: "686c7916-64bf-4289-b82f-1890c7475246"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.039690 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/686c7916-64bf-4289-b82f-1890c7475246-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.316742 4841 generic.go:334] "Generic (PLEG): container finished" podID="686c7916-64bf-4289-b82f-1890c7475246" containerID="e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77" exitCode=0
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.316786 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerDied","Data":"e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77"}
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.316816 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"686c7916-64bf-4289-b82f-1890c7475246","Type":"ContainerDied","Data":"75ca51b4142be2124983a1c3343ab1374703539231e6c9a6ef1954cab82c2a00"}
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.316834 4841 scope.go:117] "RemoveContainer" containerID="42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.317767 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.350774 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.352449 4841 scope.go:117] "RemoveContainer" containerID="e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.359460 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.396465 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.396927 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="proxy-httpd"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.396939 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="proxy-httpd"
Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.396964 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="sg-core"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.396971 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="sg-core"
Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.396983 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-notification-agent"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.396989 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-notification-agent"
Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.397001 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-central-agent"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.397008 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-central-agent"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.397186 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="proxy-httpd"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.397203 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="sg-core"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.397220 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-central-agent"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.397230 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="686c7916-64bf-4289-b82f-1890c7475246" containerName="ceilometer-notification-agent"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.399033 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.401495 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.402031 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.428168 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.493565 4841 scope.go:117] "RemoveContainer" containerID="e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.495921 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686c7916-64bf-4289-b82f-1890c7475246" path="/var/lib/kubelet/pods/686c7916-64bf-4289-b82f-1890c7475246/volumes"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.551958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-run-httpd\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.552047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-scripts\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.552066 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.552097 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-config-data\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.552173 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrxh\" (UniqueName: \"kubernetes.io/projected/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-kube-api-access-8mrxh\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.552244 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.552380 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-log-httpd\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.560146 4841 scope.go:117] "RemoveContainer" containerID="248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.642120 4841 scope.go:117] "RemoveContainer" containerID="42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa"
Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.642608 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa\": container with ID starting with 42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa not found: ID does not exist" containerID="42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.642654 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa"} err="failed to get container status \"42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa\": rpc error: code = NotFound desc = could not find container \"42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa\": container with ID starting with 42aac36adbacce290fb128b196156b4c3324e91eb3760fac07e65b3715acdfaa not found: ID does not exist"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.642683 4841 scope.go:117] "RemoveContainer" containerID="e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4"
Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.642921 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4\": container with ID starting with e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4 not found: ID does not exist" containerID="e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4"
Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.642956 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4"} err="failed to get container status \"e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4\": rpc error: code =
NotFound desc = could not find container \"e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4\": container with ID starting with e39b5cb0564056070d3f7f4d0ce5bcf996c2af85e70982ee96dc58c4d2bbd4d4 not found: ID does not exist" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.642975 4841 scope.go:117] "RemoveContainer" containerID="e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77" Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.643207 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77\": container with ID starting with e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77 not found: ID does not exist" containerID="e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.643248 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77"} err="failed to get container status \"e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77\": rpc error: code = NotFound desc = could not find container \"e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77\": container with ID starting with e8a3b681b13f9df07a66670332dcaae93a32baf82decc2d95f177194afba8c77 not found: ID does not exist" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.643266 4841 scope.go:117] "RemoveContainer" containerID="248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda" Jan 30 06:54:16 crc kubenswrapper[4841]: E0130 06:54:16.643656 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda\": container with ID starting with 
248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda not found: ID does not exist" containerID="248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.643685 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda"} err="failed to get container status \"248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda\": rpc error: code = NotFound desc = could not find container \"248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda\": container with ID starting with 248398e4779a548813ebacdd3dfdaf0c16b14e85741921578fc31fc30c31ecda not found: ID does not exist" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.655588 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-log-httpd\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.655661 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-run-httpd\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.655694 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-scripts\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.655710 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.655736 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-config-data\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.655783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mrxh\" (UniqueName: \"kubernetes.io/projected/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-kube-api-access-8mrxh\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.655826 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.661390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-log-httpd\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.669159 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-scripts\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.680725 
4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.683584 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-run-httpd\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.684185 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-config-data\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.686054 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.703091 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mrxh\" (UniqueName: \"kubernetes.io/projected/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-kube-api-access-8mrxh\") pod \"ceilometer-0\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " pod="openstack/ceilometer-0" Jan 30 06:54:16 crc kubenswrapper[4841]: I0130 06:54:16.729019 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:54:17 crc kubenswrapper[4841]: I0130 06:54:17.314427 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:54:17 crc kubenswrapper[4841]: I0130 06:54:17.314871 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce" containerName="kube-state-metrics" containerID="cri-o://a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933" gracePeriod=30 Jan 30 06:54:17 crc kubenswrapper[4841]: I0130 06:54:17.431728 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:54:17 crc kubenswrapper[4841]: W0130 06:54:17.510915 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cd70eb_9a2a_4714_8c62_a08e83a7366f.slice/crio-f47b6a8821cc859766f5686ec09eef60d473b071b7a4a84c8b135b7a20dfcf11 WatchSource:0}: Error finding container f47b6a8821cc859766f5686ec09eef60d473b071b7a4a84c8b135b7a20dfcf11: Status 404 returned error can't find the container with id f47b6a8821cc859766f5686ec09eef60d473b071b7a4a84c8b135b7a20dfcf11 Jan 30 06:54:17 crc kubenswrapper[4841]: I0130 06:54:17.905333 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:54:17 crc kubenswrapper[4841]: I0130 06:54:17.996582 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmg5g\" (UniqueName: \"kubernetes.io/projected/f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce-kube-api-access-fmg5g\") pod \"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce\" (UID: \"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce\") " Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.000866 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce-kube-api-access-fmg5g" (OuterVolumeSpecName: "kube-api-access-fmg5g") pod "f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce" (UID: "f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce"). InnerVolumeSpecName "kube-api-access-fmg5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.100253 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmg5g\" (UniqueName: \"kubernetes.io/projected/f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce-kube-api-access-fmg5g\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.336687 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce" containerID="a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933" exitCode=2 Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.336757 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce","Type":"ContainerDied","Data":"a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933"} Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.336773 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.336800 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce","Type":"ContainerDied","Data":"074a63acf427a8aa24b831e6f39c392a2dd28f24ff402e9fffa2b804ea6736be"} Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.336821 4841 scope.go:117] "RemoveContainer" containerID="a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.339544 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerStarted","Data":"3562fa2b1c65550d090c7d871bb6738a907af79877a9b89f3331d76a901ed65a"} Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.339696 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-api" containerID="cri-o://6b92d2dab1150328abb26122eb6c6a11e84b01e00f25928108cbed68c74ecdf9" gracePeriod=30 Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.339767 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-listener" containerID="cri-o://3562fa2b1c65550d090c7d871bb6738a907af79877a9b89f3331d76a901ed65a" gracePeriod=30 Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.339813 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-notifier" containerID="cri-o://7dce8f344b41918435532978e310812d59c6ea4a4c82c7dfee5932561ebc05bc" gracePeriod=30 Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.339858 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/aodh-0" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-evaluator" containerID="cri-o://09030ae9a7f7917e9a694fb352b6a013fe574302567aa4156e8dd69592df4c03" gracePeriod=30 Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.344296 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerStarted","Data":"e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b"} Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.344338 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerStarted","Data":"f47b6a8821cc859766f5686ec09eef60d473b071b7a4a84c8b135b7a20dfcf11"} Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.369176 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.157548339 podStartE2EDuration="8.369148122s" podCreationTimestamp="2026-01-30 06:54:10 +0000 UTC" firstStartedPulling="2026-01-30 06:54:11.339665077 +0000 UTC m=+6388.333137715" lastFinishedPulling="2026-01-30 06:54:17.55126486 +0000 UTC m=+6394.544737498" observedRunningTime="2026-01-30 06:54:18.363432948 +0000 UTC m=+6395.356905586" watchObservedRunningTime="2026-01-30 06:54:18.369148122 +0000 UTC m=+6395.362620760" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.422138 4841 scope.go:117] "RemoveContainer" containerID="a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933" Jan 30 06:54:18 crc kubenswrapper[4841]: E0130 06:54:18.426782 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933\": container with ID starting with a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933 not found: ID does not exist" 
containerID="a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.426846 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933"} err="failed to get container status \"a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933\": rpc error: code = NotFound desc = could not find container \"a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933\": container with ID starting with a858dcc0c11dd72ea3a1a30a42b70b68c7130cd43df11fb4853925fbc57d1933 not found: ID does not exist" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.457794 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.462951 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.476910 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:54:18 crc kubenswrapper[4841]: E0130 06:54:18.477392 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce" containerName="kube-state-metrics" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.477425 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce" containerName="kube-state-metrics" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.477608 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce" containerName="kube-state-metrics" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.478356 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.487268 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.488981 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.491703 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.612235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.612598 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.612746 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsh6\" (UniqueName: \"kubernetes.io/projected/5525028c-d229-479d-a828-a5d33c61d333-kube-api-access-ljsh6\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.612863 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.714578 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.714715 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.714753 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.714819 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsh6\" (UniqueName: \"kubernetes.io/projected/5525028c-d229-479d-a828-a5d33c61d333-kube-api-access-ljsh6\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.718996 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.719814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.720019 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5525028c-d229-479d-a828-a5d33c61d333-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.737192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsh6\" (UniqueName: \"kubernetes.io/projected/5525028c-d229-479d-a828-a5d33c61d333-kube-api-access-ljsh6\") pod \"kube-state-metrics-0\" (UID: \"5525028c-d229-479d-a828-a5d33c61d333\") " pod="openstack/kube-state-metrics-0" Jan 30 06:54:18 crc kubenswrapper[4841]: I0130 06:54:18.804945 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.298849 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 06:54:19 crc kubenswrapper[4841]: W0130 06:54:19.307698 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5525028c_d229_479d_a828_a5d33c61d333.slice/crio-19104f21d6f18294d5a922f9cb7e7d267cd1a69870ba56db26dd90bbeaa04511 WatchSource:0}: Error finding container 19104f21d6f18294d5a922f9cb7e7d267cd1a69870ba56db26dd90bbeaa04511: Status 404 returned error can't find the container with id 19104f21d6f18294d5a922f9cb7e7d267cd1a69870ba56db26dd90bbeaa04511 Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.354801 4841 generic.go:334] "Generic (PLEG): container finished" podID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerID="7dce8f344b41918435532978e310812d59c6ea4a4c82c7dfee5932561ebc05bc" exitCode=0 Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.354833 4841 generic.go:334] "Generic (PLEG): container finished" podID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerID="09030ae9a7f7917e9a694fb352b6a013fe574302567aa4156e8dd69592df4c03" exitCode=0 Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.354842 4841 generic.go:334] "Generic (PLEG): container finished" podID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerID="6b92d2dab1150328abb26122eb6c6a11e84b01e00f25928108cbed68c74ecdf9" exitCode=0 Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.354852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerDied","Data":"7dce8f344b41918435532978e310812d59c6ea4a4c82c7dfee5932561ebc05bc"} Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.354880 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerDied","Data":"09030ae9a7f7917e9a694fb352b6a013fe574302567aa4156e8dd69592df4c03"} Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.354889 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerDied","Data":"6b92d2dab1150328abb26122eb6c6a11e84b01e00f25928108cbed68c74ecdf9"} Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.356425 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerStarted","Data":"21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b"} Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.357678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5525028c-d229-479d-a828-a5d33c61d333","Type":"ContainerStarted","Data":"19104f21d6f18294d5a922f9cb7e7d267cd1a69870ba56db26dd90bbeaa04511"} Jan 30 06:54:19 crc kubenswrapper[4841]: I0130 06:54:19.422562 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:54:20 crc kubenswrapper[4841]: I0130 06:54:20.368745 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerStarted","Data":"07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8"} Jan 30 06:54:20 crc kubenswrapper[4841]: I0130 06:54:20.371057 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5525028c-d229-479d-a828-a5d33c61d333","Type":"ContainerStarted","Data":"e51059c64ea2a571862b8a12f98c69ecbe142810c125d9e54634cb44bc898fbb"} Jan 30 06:54:20 crc kubenswrapper[4841]: I0130 06:54:20.371185 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 06:54:20 crc 
kubenswrapper[4841]: I0130 06:54:20.388046 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.899358697 podStartE2EDuration="2.388029857s" podCreationTimestamp="2026-01-30 06:54:18 +0000 UTC" firstStartedPulling="2026-01-30 06:54:19.310515562 +0000 UTC m=+6396.303988200" lastFinishedPulling="2026-01-30 06:54:19.799186722 +0000 UTC m=+6396.792659360" observedRunningTime="2026-01-30 06:54:20.385853969 +0000 UTC m=+6397.379326637" watchObservedRunningTime="2026-01-30 06:54:20.388029857 +0000 UTC m=+6397.381502485" Jan 30 06:54:20 crc kubenswrapper[4841]: I0130 06:54:20.445383 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce" path="/var/lib/kubelet/pods/f4b12b8d-c68c-4c1e-aa7b-3bdb63fc57ce/volumes" Jan 30 06:54:20 crc kubenswrapper[4841]: I0130 06:54:20.630799 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 06:54:20 crc kubenswrapper[4841]: I0130 06:54:20.650098 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.382790 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerStarted","Data":"e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3"} Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.383654 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="proxy-httpd" containerID="cri-o://e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3" gracePeriod=30 Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.383668 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-notification-agent" containerID="cri-o://21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b" gracePeriod=30 Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.383678 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="sg-core" containerID="cri-o://07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8" gracePeriod=30 Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.383866 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-central-agent" containerID="cri-o://e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b" gracePeriod=30 Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.396216 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.500251 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.992221362 podStartE2EDuration="5.500218011s" podCreationTimestamp="2026-01-30 06:54:16 +0000 UTC" firstStartedPulling="2026-01-30 06:54:17.514384314 +0000 UTC m=+6394.507856952" lastFinishedPulling="2026-01-30 06:54:21.022380963 +0000 UTC m=+6398.015853601" observedRunningTime="2026-01-30 06:54:21.420862707 +0000 UTC m=+6398.414335345" watchObservedRunningTime="2026-01-30 06:54:21.500218011 +0000 UTC m=+6398.493690639" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.568027 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b8f2c"] Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.572334 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.584674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-catalog-content\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.584784 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-utilities\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.584829 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqdz\" (UniqueName: \"kubernetes.io/projected/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-kube-api-access-mcqdz\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.609624 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8f2c"] Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.686941 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-catalog-content\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.687069 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-utilities\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.687118 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqdz\" (UniqueName: \"kubernetes.io/projected/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-kube-api-access-mcqdz\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.687826 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-catalog-content\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.688136 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-utilities\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.710524 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqdz\" (UniqueName: \"kubernetes.io/projected/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-kube-api-access-mcqdz\") pod \"certified-operators-b8f2c\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:21 crc kubenswrapper[4841]: I0130 06:54:21.898880 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:22 crc kubenswrapper[4841]: I0130 06:54:22.397579 4841 generic.go:334] "Generic (PLEG): container finished" podID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerID="07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8" exitCode=2 Jan 30 06:54:22 crc kubenswrapper[4841]: I0130 06:54:22.397824 4841 generic.go:334] "Generic (PLEG): container finished" podID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerID="21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b" exitCode=0 Jan 30 06:54:22 crc kubenswrapper[4841]: I0130 06:54:22.397652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerDied","Data":"07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8"} Jan 30 06:54:22 crc kubenswrapper[4841]: I0130 06:54:22.397943 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerDied","Data":"21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b"} Jan 30 06:54:22 crc kubenswrapper[4841]: I0130 06:54:22.450572 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8f2c"] Jan 30 06:54:22 crc kubenswrapper[4841]: W0130 06:54:22.469129 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb416a47_9b95_47e1_ae5c_35bdb775b9f0.slice/crio-267cc02408d3072c5c45bc3fec194bcf8e14c41f5b54d3bdd05fed2500c48be6 WatchSource:0}: Error finding container 267cc02408d3072c5c45bc3fec194bcf8e14c41f5b54d3bdd05fed2500c48be6: Status 404 returned error can't find the container with id 267cc02408d3072c5c45bc3fec194bcf8e14c41f5b54d3bdd05fed2500c48be6 Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.038178 4841 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-db-create-85l9n"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.048204 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-75xt8"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.057358 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c2a2-account-create-update-2892c"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.065839 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-85l9n"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.074369 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c2a2-account-create-update-2892c"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.082575 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-pgdhb"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.091580 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-75xt8"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.099694 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-pgdhb"] Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.406708 4841 generic.go:334] "Generic (PLEG): container finished" podID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerID="ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502" exitCode=0 Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.406756 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8f2c" event={"ID":"bb416a47-9b95-47e1-ae5c-35bdb775b9f0","Type":"ContainerDied","Data":"ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502"} Jan 30 06:54:23 crc kubenswrapper[4841]: I0130 06:54:23.406816 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8f2c" 
event={"ID":"bb416a47-9b95-47e1-ae5c-35bdb775b9f0","Type":"ContainerStarted","Data":"267cc02408d3072c5c45bc3fec194bcf8e14c41f5b54d3bdd05fed2500c48be6"} Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.032238 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-390c-account-create-update-jjv5j"] Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.040050 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1599-account-create-update-t5v9q"] Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.048259 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-390c-account-create-update-jjv5j"] Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.055202 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1599-account-create-update-t5v9q"] Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.445157 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f1a988-8b8f-4799-9efa-b4ec26393ad2" path="/var/lib/kubelet/pods/55f1a988-8b8f-4799-9efa-b4ec26393ad2/volumes" Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.445790 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83eb4b2d-ef33-4018-b164-277863dc9bd6" path="/var/lib/kubelet/pods/83eb4b2d-ef33-4018-b164-277863dc9bd6/volumes" Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.446364 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1de6617-005c-4b5a-b230-b1578d641b2b" path="/var/lib/kubelet/pods/a1de6617-005c-4b5a-b230-b1578d641b2b/volumes" Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.446933 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e01bb5-4f5d-453c-9909-115f85275590" path="/var/lib/kubelet/pods/b3e01bb5-4f5d-453c-9909-115f85275590/volumes" Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.447942 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b80476c5-38e5-46e8-ba13-a999779eca8c" path="/var/lib/kubelet/pods/b80476c5-38e5-46e8-ba13-a999779eca8c/volumes" Jan 30 06:54:24 crc kubenswrapper[4841]: I0130 06:54:24.448510 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8527b4c-dd51-4993-8997-909f5c4fd939" path="/var/lib/kubelet/pods/b8527b4c-dd51-4993-8997-909f5c4fd939/volumes" Jan 30 06:54:25 crc kubenswrapper[4841]: I0130 06:54:25.073436 4841 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd248d34a-8ccf-48dd-bb30-9ad79bd380c8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd248d34a-8ccf-48dd-bb30-9ad79bd380c8] : Timed out while waiting for systemd to remove kubepods-besteffort-podd248d34a_8ccf_48dd_bb30_9ad79bd380c8.slice" Jan 30 06:54:25 crc kubenswrapper[4841]: E0130 06:54:25.073778 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podd248d34a-8ccf-48dd-bb30-9ad79bd380c8] : unable to destroy cgroup paths for cgroup [kubepods besteffort podd248d34a-8ccf-48dd-bb30-9ad79bd380c8] : Timed out while waiting for systemd to remove kubepods-besteffort-podd248d34a_8ccf_48dd_bb30_9ad79bd380c8.slice" pod="openstack/aodh-1557-account-create-update-jbrq7" podUID="d248d34a-8ccf-48dd-bb30-9ad79bd380c8" Jan 30 06:54:25 crc kubenswrapper[4841]: I0130 06:54:25.077133 4841 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4dbd7f71_0dfa_40ca_b8a4_1455a31e3f44.slice" Jan 30 06:54:25 crc kubenswrapper[4841]: E0130 06:54:25.077162 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort 
pod4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4dbd7f71_0dfa_40ca_b8a4_1455a31e3f44.slice" pod="openstack/aodh-db-create-96zrp" podUID="4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" Jan 30 06:54:25 crc kubenswrapper[4841]: I0130 06:54:25.432114 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8f2c" event={"ID":"bb416a47-9b95-47e1-ae5c-35bdb775b9f0","Type":"ContainerStarted","Data":"eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147"} Jan 30 06:54:25 crc kubenswrapper[4841]: I0130 06:54:25.432194 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1557-account-create-update-jbrq7" Jan 30 06:54:25 crc kubenswrapper[4841]: I0130 06:54:25.432194 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-96zrp" Jan 30 06:54:27 crc kubenswrapper[4841]: I0130 06:54:27.464074 4841 generic.go:334] "Generic (PLEG): container finished" podID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerID="e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b" exitCode=0 Jan 30 06:54:27 crc kubenswrapper[4841]: I0130 06:54:27.464142 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerDied","Data":"e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b"} Jan 30 06:54:28 crc kubenswrapper[4841]: I0130 06:54:28.816633 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 06:54:29 crc kubenswrapper[4841]: I0130 06:54:29.486765 4841 generic.go:334] "Generic (PLEG): container finished" podID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" 
containerID="eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147" exitCode=0 Jan 30 06:54:29 crc kubenswrapper[4841]: I0130 06:54:29.486807 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8f2c" event={"ID":"bb416a47-9b95-47e1-ae5c-35bdb775b9f0","Type":"ContainerDied","Data":"eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147"} Jan 30 06:54:33 crc kubenswrapper[4841]: I0130 06:54:33.030463 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws5lz"] Jan 30 06:54:33 crc kubenswrapper[4841]: I0130 06:54:33.045255 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ws5lz"] Jan 30 06:54:33 crc kubenswrapper[4841]: I0130 06:54:33.532364 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8f2c" event={"ID":"bb416a47-9b95-47e1-ae5c-35bdb775b9f0","Type":"ContainerStarted","Data":"7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552"} Jan 30 06:54:33 crc kubenswrapper[4841]: I0130 06:54:33.557352 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b8f2c" podStartSLOduration=3.565952543 podStartE2EDuration="12.557333082s" podCreationTimestamp="2026-01-30 06:54:21 +0000 UTC" firstStartedPulling="2026-01-30 06:54:23.408839277 +0000 UTC m=+6400.402311915" lastFinishedPulling="2026-01-30 06:54:32.400219786 +0000 UTC m=+6409.393692454" observedRunningTime="2026-01-30 06:54:33.554049254 +0000 UTC m=+6410.547521922" watchObservedRunningTime="2026-01-30 06:54:33.557333082 +0000 UTC m=+6410.550805720" Jan 30 06:54:34 crc kubenswrapper[4841]: I0130 06:54:34.450686 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797cdae3-bb0b-497e-b8a6-cdd4e3c13f32" path="/var/lib/kubelet/pods/797cdae3-bb0b-497e-b8a6-cdd4e3c13f32/volumes" Jan 30 06:54:40 crc 
kubenswrapper[4841]: I0130 06:54:40.463953 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 06:54:40 crc kubenswrapper[4841]: I0130 06:54:40.464669 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 06:54:40 crc kubenswrapper[4841]: I0130 06:54:40.470678 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 06:54:40 crc kubenswrapper[4841]: I0130 06:54:40.475648 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 06:54:40 crc kubenswrapper[4841]: I0130 06:54:40.475811 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" gracePeriod=600 Jan 30 06:54:40 crc kubenswrapper[4841]: I0130 06:54:40.609586 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" 
containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" exitCode=0 Jan 30 06:54:40 crc kubenswrapper[4841]: I0130 06:54:40.609635 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2"} Jan 30 06:54:40 crc kubenswrapper[4841]: I0130 06:54:40.609675 4841 scope.go:117] "RemoveContainer" containerID="5f5c280cd214672383e7f51c0a2877cc3f0cec94eda93bd7736e321f46e8506f" Jan 30 06:54:40 crc kubenswrapper[4841]: E0130 06:54:40.613485 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:54:41 crc kubenswrapper[4841]: I0130 06:54:41.627569 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:54:41 crc kubenswrapper[4841]: E0130 06:54:41.628800 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:54:41 crc kubenswrapper[4841]: I0130 06:54:41.899652 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:41 crc kubenswrapper[4841]: 
I0130 06:54:41.900087 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:41 crc kubenswrapper[4841]: I0130 06:54:41.987350 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:42 crc kubenswrapper[4841]: I0130 06:54:42.716680 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:42 crc kubenswrapper[4841]: I0130 06:54:42.767931 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8f2c"] Jan 30 06:54:44 crc kubenswrapper[4841]: I0130 06:54:44.669478 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b8f2c" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="registry-server" containerID="cri-o://7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552" gracePeriod=2 Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.215158 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.386810 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-utilities\") pod \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.387031 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcqdz\" (UniqueName: \"kubernetes.io/projected/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-kube-api-access-mcqdz\") pod \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.387104 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-catalog-content\") pod \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\" (UID: \"bb416a47-9b95-47e1-ae5c-35bdb775b9f0\") " Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.389106 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-utilities" (OuterVolumeSpecName: "utilities") pod "bb416a47-9b95-47e1-ae5c-35bdb775b9f0" (UID: "bb416a47-9b95-47e1-ae5c-35bdb775b9f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.399231 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-kube-api-access-mcqdz" (OuterVolumeSpecName: "kube-api-access-mcqdz") pod "bb416a47-9b95-47e1-ae5c-35bdb775b9f0" (UID: "bb416a47-9b95-47e1-ae5c-35bdb775b9f0"). InnerVolumeSpecName "kube-api-access-mcqdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.444837 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb416a47-9b95-47e1-ae5c-35bdb775b9f0" (UID: "bb416a47-9b95-47e1-ae5c-35bdb775b9f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.489715 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.489760 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcqdz\" (UniqueName: \"kubernetes.io/projected/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-kube-api-access-mcqdz\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.489772 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb416a47-9b95-47e1-ae5c-35bdb775b9f0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.681293 4841 generic.go:334] "Generic (PLEG): container finished" podID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerID="7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552" exitCode=0 Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.681353 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8f2c" event={"ID":"bb416a47-9b95-47e1-ae5c-35bdb775b9f0","Type":"ContainerDied","Data":"7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552"} Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.681430 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-b8f2c" event={"ID":"bb416a47-9b95-47e1-ae5c-35bdb775b9f0","Type":"ContainerDied","Data":"267cc02408d3072c5c45bc3fec194bcf8e14c41f5b54d3bdd05fed2500c48be6"} Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.681424 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8f2c" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.681456 4841 scope.go:117] "RemoveContainer" containerID="7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.715376 4841 scope.go:117] "RemoveContainer" containerID="eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.733716 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8f2c"] Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.751205 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b8f2c"] Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.781692 4841 scope.go:117] "RemoveContainer" containerID="ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.815070 4841 scope.go:117] "RemoveContainer" containerID="7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552" Jan 30 06:54:45 crc kubenswrapper[4841]: E0130 06:54:45.815647 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552\": container with ID starting with 7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552 not found: ID does not exist" containerID="7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 
06:54:45.815679 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552"} err="failed to get container status \"7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552\": rpc error: code = NotFound desc = could not find container \"7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552\": container with ID starting with 7f62240174e4b495ea68c49d270e15047076355da3c72633cbda09ed92369552 not found: ID does not exist" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.815714 4841 scope.go:117] "RemoveContainer" containerID="eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147" Jan 30 06:54:45 crc kubenswrapper[4841]: E0130 06:54:45.817633 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147\": container with ID starting with eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147 not found: ID does not exist" containerID="eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.817682 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147"} err="failed to get container status \"eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147\": rpc error: code = NotFound desc = could not find container \"eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147\": container with ID starting with eb00382ea6918ed027250059ed14d2ee0d8af383f078b9b2488e962ab547f147 not found: ID does not exist" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.817720 4841 scope.go:117] "RemoveContainer" containerID="ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502" Jan 30 06:54:45 crc 
kubenswrapper[4841]: E0130 06:54:45.818126 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502\": container with ID starting with ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502 not found: ID does not exist" containerID="ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502" Jan 30 06:54:45 crc kubenswrapper[4841]: I0130 06:54:45.818163 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502"} err="failed to get container status \"ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502\": rpc error: code = NotFound desc = could not find container \"ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502\": container with ID starting with ad09933cdf1dda48ecd911ea33aa48c73e339149c544d0357505243fcbec8502 not found: ID does not exist" Jan 30 06:54:46 crc kubenswrapper[4841]: I0130 06:54:46.451217 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" path="/var/lib/kubelet/pods/bb416a47-9b95-47e1-ae5c-35bdb775b9f0/volumes" Jan 30 06:54:46 crc kubenswrapper[4841]: I0130 06:54:46.730072 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 06:54:46 crc kubenswrapper[4841]: I0130 06:54:46.734533 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.715756 4841 generic.go:334] "Generic (PLEG): container finished" podID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerID="3562fa2b1c65550d090c7d871bb6738a907af79877a9b89f3331d76a901ed65a" exitCode=137 
Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.715831 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerDied","Data":"3562fa2b1c65550d090c7d871bb6738a907af79877a9b89f3331d76a901ed65a"} Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.864032 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.965089 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-scripts\") pod \"19fdd47a-240f-41ce-98a6-b331af293ca1\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.965180 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjfzm\" (UniqueName: \"kubernetes.io/projected/19fdd47a-240f-41ce-98a6-b331af293ca1-kube-api-access-zjfzm\") pod \"19fdd47a-240f-41ce-98a6-b331af293ca1\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.965316 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-config-data\") pod \"19fdd47a-240f-41ce-98a6-b331af293ca1\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.965442 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-combined-ca-bundle\") pod \"19fdd47a-240f-41ce-98a6-b331af293ca1\" (UID: \"19fdd47a-240f-41ce-98a6-b331af293ca1\") " Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.983868 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/19fdd47a-240f-41ce-98a6-b331af293ca1-kube-api-access-zjfzm" (OuterVolumeSpecName: "kube-api-access-zjfzm") pod "19fdd47a-240f-41ce-98a6-b331af293ca1" (UID: "19fdd47a-240f-41ce-98a6-b331af293ca1"). InnerVolumeSpecName "kube-api-access-zjfzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:54:48 crc kubenswrapper[4841]: I0130 06:54:48.995657 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-scripts" (OuterVolumeSpecName: "scripts") pod "19fdd47a-240f-41ce-98a6-b331af293ca1" (UID: "19fdd47a-240f-41ce-98a6-b331af293ca1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.068388 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjfzm\" (UniqueName: \"kubernetes.io/projected/19fdd47a-240f-41ce-98a6-b331af293ca1-kube-api-access-zjfzm\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.068442 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.100738 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19fdd47a-240f-41ce-98a6-b331af293ca1" (UID: "19fdd47a-240f-41ce-98a6-b331af293ca1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.128279 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-config-data" (OuterVolumeSpecName: "config-data") pod "19fdd47a-240f-41ce-98a6-b331af293ca1" (UID: "19fdd47a-240f-41ce-98a6-b331af293ca1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.170901 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.170937 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdd47a-240f-41ce-98a6-b331af293ca1-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.732359 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"19fdd47a-240f-41ce-98a6-b331af293ca1","Type":"ContainerDied","Data":"3998174449c5d4a2a64101603791d7e3fb95911d632898f0f830f3a3d5e82920"} Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.732844 4841 scope.go:117] "RemoveContainer" containerID="3562fa2b1c65550d090c7d871bb6738a907af79877a9b89f3331d76a901ed65a" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.732558 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.794707 4841 scope.go:117] "RemoveContainer" containerID="7dce8f344b41918435532978e310812d59c6ea4a4c82c7dfee5932561ebc05bc" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.807439 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.819324 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.845628 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 30 06:54:49 crc kubenswrapper[4841]: E0130 06:54:49.846014 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="registry-server" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846031 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="registry-server" Jan 30 06:54:49 crc kubenswrapper[4841]: E0130 06:54:49.846047 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-api" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846053 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-api" Jan 30 06:54:49 crc kubenswrapper[4841]: E0130 06:54:49.846067 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="extract-content" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846072 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="extract-content" Jan 30 06:54:49 crc kubenswrapper[4841]: E0130 06:54:49.846083 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" 
containerName="aodh-listener" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846088 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-listener" Jan 30 06:54:49 crc kubenswrapper[4841]: E0130 06:54:49.846105 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="extract-utilities" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846112 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="extract-utilities" Jan 30 06:54:49 crc kubenswrapper[4841]: E0130 06:54:49.846124 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-evaluator" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846130 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-evaluator" Jan 30 06:54:49 crc kubenswrapper[4841]: E0130 06:54:49.846149 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-notifier" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846155 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-notifier" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846318 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-listener" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846342 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb416a47-9b95-47e1-ae5c-35bdb775b9f0" containerName="registry-server" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846352 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" 
containerName="aodh-api" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846363 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-evaluator" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.846371 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" containerName="aodh-notifier" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.848109 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.851628 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.855567 4841 scope.go:117] "RemoveContainer" containerID="09030ae9a7f7917e9a694fb352b6a013fe574302567aa4156e8dd69592df4c03" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.855838 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9swpr" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.856229 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.856568 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.856738 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.885526 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.889919 4841 scope.go:117] "RemoveContainer" containerID="6b92d2dab1150328abb26122eb6c6a11e84b01e00f25928108cbed68c74ecdf9" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.994213 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-internal-tls-certs\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.994257 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-public-tls-certs\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.994566 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.994655 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-scripts\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.994729 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkmzh\" (UniqueName: \"kubernetes.io/projected/ec786f59-6593-4344-988a-14e9236879df-kube-api-access-lkmzh\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:49 crc kubenswrapper[4841]: I0130 06:54:49.994813 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-config-data\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.096897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-internal-tls-certs\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.096940 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-public-tls-certs\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.097019 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.097040 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-scripts\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.097072 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkmzh\" (UniqueName: \"kubernetes.io/projected/ec786f59-6593-4344-988a-14e9236879df-kube-api-access-lkmzh\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.097776 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-config-data\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.102829 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-config-data\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.104940 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-public-tls-certs\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.106216 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-internal-tls-certs\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.116635 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-scripts\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.117310 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec786f59-6593-4344-988a-14e9236879df-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.121958 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkmzh\" (UniqueName: \"kubernetes.io/projected/ec786f59-6593-4344-988a-14e9236879df-kube-api-access-lkmzh\") pod \"aodh-0\" (UID: \"ec786f59-6593-4344-988a-14e9236879df\") " pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.168548 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.446242 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fdd47a-240f-41ce-98a6-b331af293ca1" path="/var/lib/kubelet/pods/19fdd47a-240f-41ce-98a6-b331af293ca1/volumes" Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.695748 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 30 06:54:50 crc kubenswrapper[4841]: I0130 06:54:50.743261 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ec786f59-6593-4344-988a-14e9236879df","Type":"ContainerStarted","Data":"ffd21b7872cdafb2e298a3aa79c9ecdd3d668b0a8a4a8b8ab5717386f549f730"} Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.038563 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwdkc"] Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.061159 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwdkc"] Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.720943 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.767714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ec786f59-6593-4344-988a-14e9236879df","Type":"ContainerStarted","Data":"ef9efd673cfd58fb4a4f259ece541ea4ad56fbdfcea74895b71e15aec0e237da"} Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.770334 4841 generic.go:334] "Generic (PLEG): container finished" podID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerID="e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3" exitCode=137 Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.770368 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerDied","Data":"e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3"} Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.770386 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5cd70eb-9a2a-4714-8c62-a08e83a7366f","Type":"ContainerDied","Data":"f47b6a8821cc859766f5686ec09eef60d473b071b7a4a84c8b135b7a20dfcf11"} Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.770415 4841 scope.go:117] "RemoveContainer" containerID="e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.770518 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.793810 4841 scope.go:117] "RemoveContainer" containerID="07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.815712 4841 scope.go:117] "RemoveContainer" containerID="21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.839941 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-combined-ca-bundle\") pod \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.840000 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-sg-core-conf-yaml\") pod \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.840036 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-log-httpd\") pod \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.840095 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-scripts\") pod \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.840145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-run-httpd\") pod \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.840163 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mrxh\" (UniqueName: \"kubernetes.io/projected/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-kube-api-access-8mrxh\") pod \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.840263 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-config-data\") pod \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\" (UID: \"c5cd70eb-9a2a-4714-8c62-a08e83a7366f\") " Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.842223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5cd70eb-9a2a-4714-8c62-a08e83a7366f" (UID: "c5cd70eb-9a2a-4714-8c62-a08e83a7366f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.845351 4841 scope.go:117] "RemoveContainer" containerID="e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.849141 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-kube-api-access-8mrxh" (OuterVolumeSpecName: "kube-api-access-8mrxh") pod "c5cd70eb-9a2a-4714-8c62-a08e83a7366f" (UID: "c5cd70eb-9a2a-4714-8c62-a08e83a7366f"). InnerVolumeSpecName "kube-api-access-8mrxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.851244 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c5cd70eb-9a2a-4714-8c62-a08e83a7366f" (UID: "c5cd70eb-9a2a-4714-8c62-a08e83a7366f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.858924 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-scripts" (OuterVolumeSpecName: "scripts") pod "c5cd70eb-9a2a-4714-8c62-a08e83a7366f" (UID: "c5cd70eb-9a2a-4714-8c62-a08e83a7366f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.872487 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c5cd70eb-9a2a-4714-8c62-a08e83a7366f" (UID: "c5cd70eb-9a2a-4714-8c62-a08e83a7366f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.943026 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.943054 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.943064 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.943073 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:51 crc kubenswrapper[4841]: I0130 06:54:51.943084 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mrxh\" (UniqueName: \"kubernetes.io/projected/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-kube-api-access-8mrxh\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.063951 4841 scope.go:117] "RemoveContainer" containerID="e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3" Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.070555 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3\": container with ID starting with e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3 not found: ID does not exist" 
containerID="e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.070619 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3"} err="failed to get container status \"e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3\": rpc error: code = NotFound desc = could not find container \"e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3\": container with ID starting with e63cf92966d0a5f9208cb9dc3329ae1df28599fe89be853d76d28f04698c18a3 not found: ID does not exist" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.070655 4841 scope.go:117] "RemoveContainer" containerID="07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8" Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.071498 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8\": container with ID starting with 07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8 not found: ID does not exist" containerID="07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.071519 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8"} err="failed to get container status \"07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8\": rpc error: code = NotFound desc = could not find container \"07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8\": container with ID starting with 07f6c0d7ee1478df1b82703160439f8379b45e6b9c40d8d0be363754ed94c6f8 not found: ID does not exist" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.071532 4841 scope.go:117] 
"RemoveContainer" containerID="21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b" Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.073041 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b\": container with ID starting with 21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b not found: ID does not exist" containerID="21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.073064 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b"} err="failed to get container status \"21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b\": rpc error: code = NotFound desc = could not find container \"21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b\": container with ID starting with 21b04ff46b6e27f5671941207387114838e069a64fce5a794dabf244da5fee0b not found: ID does not exist" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.073082 4841 scope.go:117] "RemoveContainer" containerID="e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b" Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.078494 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b\": container with ID starting with e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b not found: ID does not exist" containerID="e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.078537 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b"} err="failed to get container status \"e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b\": rpc error: code = NotFound desc = could not find container \"e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b\": container with ID starting with e0620540d352532451e120e68ae221b2d79f56d522bf6d4ad45d12a29c51c90b not found: ID does not exist" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.104756 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mccnj"] Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.104905 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5cd70eb-9a2a-4714-8c62-a08e83a7366f" (UID: "c5cd70eb-9a2a-4714-8c62-a08e83a7366f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.110555 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mccnj"] Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.124167 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-config-data" (OuterVolumeSpecName: "config-data") pod "c5cd70eb-9a2a-4714-8c62-a08e83a7366f" (UID: "c5cd70eb-9a2a-4714-8c62-a08e83a7366f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.151789 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.151825 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd70eb-9a2a-4714-8c62-a08e83a7366f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.405367 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.415903 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.430824 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.432563 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-central-agent" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432595 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-central-agent" Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.432627 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-notification-agent" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432634 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-notification-agent" Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.432644 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="sg-core" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432651 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="sg-core" Jan 30 06:54:52 crc kubenswrapper[4841]: E0130 06:54:52.432660 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="proxy-httpd" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432666 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="proxy-httpd" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432879 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="proxy-httpd" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432889 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-notification-agent" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432897 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="ceilometer-central-agent" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.432915 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" containerName="sg-core" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.435025 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.443702 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.443720 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.444237 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.482588 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273b206a-f216-479f-927c-aed775be10b6" path="/var/lib/kubelet/pods/273b206a-f216-479f-927c-aed775be10b6/volumes" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.484351 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995f0195-eb79-4f48-ac66-ac38b0f7cd0f" path="/var/lib/kubelet/pods/995f0195-eb79-4f48-ac66-ac38b0f7cd0f/volumes" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.485721 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cd70eb-9a2a-4714-8c62-a08e83a7366f" path="/var/lib/kubelet/pods/c5cd70eb-9a2a-4714-8c62-a08e83a7366f/volumes" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.487065 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573274 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573312 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-scripts\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573408 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-config-data\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3735cd33-a80d-4d70-a60e-28f51e415a4e-run-httpd\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573443 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3735cd33-a80d-4d70-a60e-28f51e415a4e-log-httpd\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573466 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573501 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qh5\" (UniqueName: \"kubernetes.io/projected/3735cd33-a80d-4d70-a60e-28f51e415a4e-kube-api-access-58qh5\") pod \"ceilometer-0\" (UID: 
\"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.573551 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.674901 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qh5\" (UniqueName: \"kubernetes.io/projected/3735cd33-a80d-4d70-a60e-28f51e415a4e-kube-api-access-58qh5\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.674971 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.675057 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.675084 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-scripts\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.675133 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-config-data\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.675151 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3735cd33-a80d-4d70-a60e-28f51e415a4e-run-httpd\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.675163 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3735cd33-a80d-4d70-a60e-28f51e415a4e-log-httpd\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.675186 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.677635 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3735cd33-a80d-4d70-a60e-28f51e415a4e-run-httpd\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.678995 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3735cd33-a80d-4d70-a60e-28f51e415a4e-log-httpd\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 
06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.680850 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.680974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.681641 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-config-data\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.684317 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-scripts\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.685127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3735cd33-a80d-4d70-a60e-28f51e415a4e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.692165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qh5\" (UniqueName: \"kubernetes.io/projected/3735cd33-a80d-4d70-a60e-28f51e415a4e-kube-api-access-58qh5\") pod 
\"ceilometer-0\" (UID: \"3735cd33-a80d-4d70-a60e-28f51e415a4e\") " pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.770005 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 06:54:52 crc kubenswrapper[4841]: I0130 06:54:52.803297 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ec786f59-6593-4344-988a-14e9236879df","Type":"ContainerStarted","Data":"ef33b10f963a0a48efd907f90d0b972967d385c8588d7633ab38256c2aee2e06"} Jan 30 06:54:53 crc kubenswrapper[4841]: I0130 06:54:53.270827 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 06:54:53 crc kubenswrapper[4841]: I0130 06:54:53.830881 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3735cd33-a80d-4d70-a60e-28f51e415a4e","Type":"ContainerStarted","Data":"df21ed36dbb666f0740273b3bc84bcf9563b951b8ec3f501afb7f710e9edc712"} Jan 30 06:54:53 crc kubenswrapper[4841]: I0130 06:54:53.836710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ec786f59-6593-4344-988a-14e9236879df","Type":"ContainerStarted","Data":"eacb1af856c2cdd84cc2e9ed0af31aa87f4a411924d457630ecf6b9dfa673571"} Jan 30 06:54:54 crc kubenswrapper[4841]: I0130 06:54:54.855221 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3735cd33-a80d-4d70-a60e-28f51e415a4e","Type":"ContainerStarted","Data":"e806a9d731331bd796983876cd5535119a4dc5cf161fd066033b17bbe26bb821"} Jan 30 06:54:54 crc kubenswrapper[4841]: I0130 06:54:54.855823 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3735cd33-a80d-4d70-a60e-28f51e415a4e","Type":"ContainerStarted","Data":"1edc82526c37af9ed52cafd48b538549240dfc119465ab9d801fb906c4ad9484"} Jan 30 06:54:54 crc kubenswrapper[4841]: I0130 06:54:54.860431 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/aodh-0" event={"ID":"ec786f59-6593-4344-988a-14e9236879df","Type":"ContainerStarted","Data":"a502fe383cb3e6b5cd20c8f3a8d6c99cc47dfbf34eb61a6c9fb687d41c890576"} Jan 30 06:54:54 crc kubenswrapper[4841]: I0130 06:54:54.889562 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.750205705 podStartE2EDuration="5.889545627s" podCreationTimestamp="2026-01-30 06:54:49 +0000 UTC" firstStartedPulling="2026-01-30 06:54:50.7141797 +0000 UTC m=+6427.707652338" lastFinishedPulling="2026-01-30 06:54:53.853519612 +0000 UTC m=+6430.846992260" observedRunningTime="2026-01-30 06:54:54.883080592 +0000 UTC m=+6431.876553220" watchObservedRunningTime="2026-01-30 06:54:54.889545627 +0000 UTC m=+6431.883018255" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.400482 4841 scope.go:117] "RemoveContainer" containerID="64c11b2824241bdfc1a972a0a8cb5f7c371117564c57e3a50363123323495590" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.473119 4841 scope.go:117] "RemoveContainer" containerID="66b4d4b8ef2b5b7d96321d40fc7e26088f66b7a180d0ab03cd01e57807a5ea1a" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.518539 4841 scope.go:117] "RemoveContainer" containerID="51cb556626cb01050492c4a16cbf0f284dd302575ccee459cafea347ce884939" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.548612 4841 scope.go:117] "RemoveContainer" containerID="ff7ee4bfc96e7a9b617a23c96e78aac89195c1ab971439d27f61a84ceccb7268" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.578171 4841 scope.go:117] "RemoveContainer" containerID="62a204ea064b2252916ea57f5f0f20afbca6ab750f74b88c97979f918df1721a" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.613811 4841 scope.go:117] "RemoveContainer" containerID="bd65f3076ea24c438dde45460632288a230a6e6a52c3fcf6f672d01f2b2982f3" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.651668 4841 scope.go:117] "RemoveContainer" 
containerID="64e4d66c014416becbe6b28e4b5228500fe624eaa4ba0aafbb94b47a30e0dbc6" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.716967 4841 scope.go:117] "RemoveContainer" containerID="4d625a00dde5495041c61246d870b950a9d82636d2edbbc24c5dc1c4e2c2729b" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.810044 4841 scope.go:117] "RemoveContainer" containerID="e614bb3403e088c40feda02435441d13c0e4673e306b16c045e5bcf222e3e102" Jan 30 06:54:55 crc kubenswrapper[4841]: I0130 06:54:55.929824 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3735cd33-a80d-4d70-a60e-28f51e415a4e","Type":"ContainerStarted","Data":"3c74dd17ad29c383ea5181a09ce8c86fe7d3eb55003bb3e9419fd1567c1a1eab"} Jan 30 06:54:57 crc kubenswrapper[4841]: I0130 06:54:57.433240 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:54:57 crc kubenswrapper[4841]: E0130 06:54:57.433711 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:54:58 crc kubenswrapper[4841]: I0130 06:54:58.967362 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3735cd33-a80d-4d70-a60e-28f51e415a4e","Type":"ContainerStarted","Data":"34e5043e2e4008ad601418126ead76c96b68e60eadd09603b5877a03cacb3215"} Jan 30 06:54:58 crc kubenswrapper[4841]: I0130 06:54:58.968194 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 06:54:59 crc kubenswrapper[4841]: I0130 06:54:59.012121 4841 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=2.574003834 podStartE2EDuration="7.012094017s" podCreationTimestamp="2026-01-30 06:54:52 +0000 UTC" firstStartedPulling="2026-01-30 06:54:53.283856964 +0000 UTC m=+6430.277329602" lastFinishedPulling="2026-01-30 06:54:57.721947117 +0000 UTC m=+6434.715419785" observedRunningTime="2026-01-30 06:54:58.998824458 +0000 UTC m=+6435.992297106" watchObservedRunningTime="2026-01-30 06:54:59.012094017 +0000 UTC m=+6436.005566675" Jan 30 06:55:09 crc kubenswrapper[4841]: I0130 06:55:09.053915 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxs7z"] Jan 30 06:55:09 crc kubenswrapper[4841]: I0130 06:55:09.066241 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxs7z"] Jan 30 06:55:10 crc kubenswrapper[4841]: I0130 06:55:10.450376 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff06e12-bd8d-4324-8521-3363f844ca41" path="/var/lib/kubelet/pods/9ff06e12-bd8d-4324-8521-3363f844ca41/volumes" Jan 30 06:55:12 crc kubenswrapper[4841]: I0130 06:55:12.432327 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:55:12 crc kubenswrapper[4841]: E0130 06:55:12.434194 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:55:22 crc kubenswrapper[4841]: I0130 06:55:22.788382 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 06:55:25 crc kubenswrapper[4841]: I0130 06:55:25.432360 4841 scope.go:117] 
"RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:55:25 crc kubenswrapper[4841]: E0130 06:55:25.432858 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:55:36 crc kubenswrapper[4841]: I0130 06:55:36.433519 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:55:36 crc kubenswrapper[4841]: E0130 06:55:36.434349 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.633868 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wkctm/must-gather-s8xvz"] Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.643985 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.646588 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wkctm"/"kube-root-ca.crt" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.646824 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wkctm"/"openshift-service-ca.crt" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.657173 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wkctm/must-gather-s8xvz"] Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.741237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-must-gather-output\") pod \"must-gather-s8xvz\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.741763 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8fqk\" (UniqueName: \"kubernetes.io/projected/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-kube-api-access-l8fqk\") pod \"must-gather-s8xvz\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.843164 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8fqk\" (UniqueName: \"kubernetes.io/projected/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-kube-api-access-l8fqk\") pod \"must-gather-s8xvz\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.843248 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-must-gather-output\") pod \"must-gather-s8xvz\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.843674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-must-gather-output\") pod \"must-gather-s8xvz\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.863780 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8fqk\" (UniqueName: \"kubernetes.io/projected/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-kube-api-access-l8fqk\") pod \"must-gather-s8xvz\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:47 crc kubenswrapper[4841]: I0130 06:55:47.964206 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 06:55:48 crc kubenswrapper[4841]: I0130 06:55:48.497027 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wkctm/must-gather-s8xvz"] Jan 30 06:55:48 crc kubenswrapper[4841]: I0130 06:55:48.555726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/must-gather-s8xvz" event={"ID":"dc994cfa-341d-4e23-ac9c-72abe21d2b0c","Type":"ContainerStarted","Data":"73564c1a8d1ea1e00743e035699edbe8eb41f1c88bc90aee2cfe5025239fdcf5"} Jan 30 06:55:49 crc kubenswrapper[4841]: I0130 06:55:49.432009 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:55:49 crc kubenswrapper[4841]: E0130 06:55:49.432537 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:55:56 crc kubenswrapper[4841]: I0130 06:55:56.211290 4841 scope.go:117] "RemoveContainer" containerID="67d26d8a2b7bb3854d4c64a0d38149ff1c129f3f6a963336277c0f8ab206577b" Jan 30 06:55:56 crc kubenswrapper[4841]: I0130 06:55:56.449704 4841 scope.go:117] "RemoveContainer" containerID="9cd67ad651434aa9878f56de8705229c2671573dd5e09e67f7fe1f4c9ff352b6" Jan 30 06:55:56 crc kubenswrapper[4841]: I0130 06:55:56.481627 4841 scope.go:117] "RemoveContainer" containerID="e207f845fb90b7268c4ac6168efac75c16bf89e47f6e2148f595517a174980a6" Jan 30 06:55:56 crc kubenswrapper[4841]: I0130 06:55:56.632256 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/must-gather-s8xvz" 
event={"ID":"dc994cfa-341d-4e23-ac9c-72abe21d2b0c","Type":"ContainerStarted","Data":"60bb217417395f0454bc85373d471f2d330b025d7b07592d268eb8ee2914ed7b"} Jan 30 06:55:56 crc kubenswrapper[4841]: I0130 06:55:56.632306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/must-gather-s8xvz" event={"ID":"dc994cfa-341d-4e23-ac9c-72abe21d2b0c","Type":"ContainerStarted","Data":"97ca3af24a8d8ec56f02bf5e25aa20877120e5a51f57579c1a03a25e8c18da03"} Jan 30 06:55:56 crc kubenswrapper[4841]: I0130 06:55:56.650090 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wkctm/must-gather-s8xvz" podStartSLOduration=2.438892867 podStartE2EDuration="9.650072628s" podCreationTimestamp="2026-01-30 06:55:47 +0000 UTC" firstStartedPulling="2026-01-30 06:55:48.463768296 +0000 UTC m=+6485.457240944" lastFinishedPulling="2026-01-30 06:55:55.674948057 +0000 UTC m=+6492.668420705" observedRunningTime="2026-01-30 06:55:56.647163009 +0000 UTC m=+6493.640635647" watchObservedRunningTime="2026-01-30 06:55:56.650072628 +0000 UTC m=+6493.643545266" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.070265 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wkctm/crc-debug-5hwcn"] Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.072153 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.073717 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wkctm"/"default-dockercfg-s7wm6" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.214785 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlj2x\" (UniqueName: \"kubernetes.io/projected/a7a7b9ed-7b78-4954-bf72-ffc847065b24-kube-api-access-vlj2x\") pod \"crc-debug-5hwcn\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.214925 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7a7b9ed-7b78-4954-bf72-ffc847065b24-host\") pod \"crc-debug-5hwcn\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.317152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlj2x\" (UniqueName: \"kubernetes.io/projected/a7a7b9ed-7b78-4954-bf72-ffc847065b24-kube-api-access-vlj2x\") pod \"crc-debug-5hwcn\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.317263 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7a7b9ed-7b78-4954-bf72-ffc847065b24-host\") pod \"crc-debug-5hwcn\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.317349 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a7a7b9ed-7b78-4954-bf72-ffc847065b24-host\") pod \"crc-debug-5hwcn\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.354249 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlj2x\" (UniqueName: \"kubernetes.io/projected/a7a7b9ed-7b78-4954-bf72-ffc847065b24-kube-api-access-vlj2x\") pod \"crc-debug-5hwcn\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.434963 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:01 crc kubenswrapper[4841]: W0130 06:56:01.476509 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7a7b9ed_7b78_4954_bf72_ffc847065b24.slice/crio-453853a0d16c52ecdb5860ff8353e713213d6a59e42641a70fb192a88ba14c23 WatchSource:0}: Error finding container 453853a0d16c52ecdb5860ff8353e713213d6a59e42641a70fb192a88ba14c23: Status 404 returned error can't find the container with id 453853a0d16c52ecdb5860ff8353e713213d6a59e42641a70fb192a88ba14c23 Jan 30 06:56:01 crc kubenswrapper[4841]: I0130 06:56:01.682665 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" event={"ID":"a7a7b9ed-7b78-4954-bf72-ffc847065b24","Type":"ContainerStarted","Data":"453853a0d16c52ecdb5860ff8353e713213d6a59e42641a70fb192a88ba14c23"} Jan 30 06:56:02 crc kubenswrapper[4841]: I0130 06:56:02.433430 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:56:02 crc kubenswrapper[4841]: E0130 06:56:02.434013 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:56:12 crc kubenswrapper[4841]: I0130 06:56:12.803494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" event={"ID":"a7a7b9ed-7b78-4954-bf72-ffc847065b24","Type":"ContainerStarted","Data":"64cc1845f05ed3d4eebc200eb4b899e89b060e8e1e33e900b099d0b3d868d19b"} Jan 30 06:56:12 crc kubenswrapper[4841]: I0130 06:56:12.837936 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" podStartSLOduration=1.000742902 podStartE2EDuration="11.83790943s" podCreationTimestamp="2026-01-30 06:56:01 +0000 UTC" firstStartedPulling="2026-01-30 06:56:01.480646133 +0000 UTC m=+6498.474118771" lastFinishedPulling="2026-01-30 06:56:12.317812661 +0000 UTC m=+6509.311285299" observedRunningTime="2026-01-30 06:56:12.817989972 +0000 UTC m=+6509.811462640" watchObservedRunningTime="2026-01-30 06:56:12.83790943 +0000 UTC m=+6509.831382108" Jan 30 06:56:16 crc kubenswrapper[4841]: I0130 06:56:16.434772 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:56:16 crc kubenswrapper[4841]: E0130 06:56:16.435704 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:56:20 crc kubenswrapper[4841]: I0130 06:56:20.146959 4841 
patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 06:56:20 crc kubenswrapper[4841]: I0130 06:56:20.147679 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:56:20 crc kubenswrapper[4841]: I0130 06:56:20.146956 4841 patch_prober.go:28] interesting pod/router-default-5444994796-r5skt container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 06:56:20 crc kubenswrapper[4841]: I0130 06:56:20.147761 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-r5skt" podUID="64d09cb6-73a6-4de8-8164-d8a241df4e5c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 06:56:20 crc kubenswrapper[4841]: I0130 06:56:20.328067 4841 trace.go:236] Trace[1671896538]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-1" (30-Jan-2026 06:56:19.291) (total time: 1035ms): Jan 30 06:56:20 crc kubenswrapper[4841]: Trace[1671896538]: [1.03580851s] [1.03580851s] END Jan 30 06:56:25 crc kubenswrapper[4841]: I0130 06:56:25.059637 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="f5fdf7cb-89ff-42f2-b405-f0423ab54224" containerName="galera" 
probeResult="failure" output="command timed out" Jan 30 06:56:25 crc kubenswrapper[4841]: I0130 06:56:25.132860 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="f5fdf7cb-89ff-42f2-b405-f0423ab54224" containerName="galera" probeResult="failure" output="command timed out" Jan 30 06:56:29 crc kubenswrapper[4841]: I0130 06:56:29.433343 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:56:29 crc kubenswrapper[4841]: E0130 06:56:29.434345 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:56:44 crc kubenswrapper[4841]: I0130 06:56:44.144870 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-xblv8"] Jan 30 06:56:44 crc kubenswrapper[4841]: I0130 06:56:44.158255 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-xblv8"] Jan 30 06:56:44 crc kubenswrapper[4841]: I0130 06:56:44.440717 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:56:44 crc kubenswrapper[4841]: E0130 06:56:44.441237 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:56:44 
crc kubenswrapper[4841]: I0130 06:56:44.511003 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7b918d-ef6a-4644-bf1a-d653c4cfdeac" path="/var/lib/kubelet/pods/2c7b918d-ef6a-4644-bf1a-d653c4cfdeac/volumes" Jan 30 06:56:45 crc kubenswrapper[4841]: I0130 06:56:45.038843 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-a59b-account-create-update-xr8hp"] Jan 30 06:56:45 crc kubenswrapper[4841]: I0130 06:56:45.053588 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-a59b-account-create-update-xr8hp"] Jan 30 06:56:45 crc kubenswrapper[4841]: I0130 06:56:45.542627 4841 generic.go:334] "Generic (PLEG): container finished" podID="a7a7b9ed-7b78-4954-bf72-ffc847065b24" containerID="64cc1845f05ed3d4eebc200eb4b899e89b060e8e1e33e900b099d0b3d868d19b" exitCode=0 Jan 30 06:56:45 crc kubenswrapper[4841]: I0130 06:56:45.542713 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" event={"ID":"a7a7b9ed-7b78-4954-bf72-ffc847065b24","Type":"ContainerDied","Data":"64cc1845f05ed3d4eebc200eb4b899e89b060e8e1e33e900b099d0b3d868d19b"} Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.444615 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59452b25-eace-4fe3-985a-2efd23a31dc4" path="/var/lib/kubelet/pods/59452b25-eace-4fe3-985a-2efd23a31dc4/volumes" Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.666394 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.718685 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wkctm/crc-debug-5hwcn"] Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.728592 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wkctm/crc-debug-5hwcn"] Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.740963 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlj2x\" (UniqueName: \"kubernetes.io/projected/a7a7b9ed-7b78-4954-bf72-ffc847065b24-kube-api-access-vlj2x\") pod \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.741187 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7a7b9ed-7b78-4954-bf72-ffc847065b24-host\") pod \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\" (UID: \"a7a7b9ed-7b78-4954-bf72-ffc847065b24\") " Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.741327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7a7b9ed-7b78-4954-bf72-ffc847065b24-host" (OuterVolumeSpecName: "host") pod "a7a7b9ed-7b78-4954-bf72-ffc847065b24" (UID: "a7a7b9ed-7b78-4954-bf72-ffc847065b24"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.741850 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7a7b9ed-7b78-4954-bf72-ffc847065b24-host\") on node \"crc\" DevicePath \"\"" Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.752778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a7b9ed-7b78-4954-bf72-ffc847065b24-kube-api-access-vlj2x" (OuterVolumeSpecName: "kube-api-access-vlj2x") pod "a7a7b9ed-7b78-4954-bf72-ffc847065b24" (UID: "a7a7b9ed-7b78-4954-bf72-ffc847065b24"). InnerVolumeSpecName "kube-api-access-vlj2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:56:46 crc kubenswrapper[4841]: I0130 06:56:46.843731 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlj2x\" (UniqueName: \"kubernetes.io/projected/a7a7b9ed-7b78-4954-bf72-ffc847065b24-kube-api-access-vlj2x\") on node \"crc\" DevicePath \"\"" Jan 30 06:56:47 crc kubenswrapper[4841]: I0130 06:56:47.562917 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="453853a0d16c52ecdb5860ff8353e713213d6a59e42641a70fb192a88ba14c23" Jan 30 06:56:47 crc kubenswrapper[4841]: I0130 06:56:47.562964 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-5hwcn" Jan 30 06:56:47 crc kubenswrapper[4841]: I0130 06:56:47.969068 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wkctm/crc-debug-xhjns"] Jan 30 06:56:47 crc kubenswrapper[4841]: E0130 06:56:47.970015 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a7b9ed-7b78-4954-bf72-ffc847065b24" containerName="container-00" Jan 30 06:56:47 crc kubenswrapper[4841]: I0130 06:56:47.970037 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a7b9ed-7b78-4954-bf72-ffc847065b24" containerName="container-00" Jan 30 06:56:47 crc kubenswrapper[4841]: I0130 06:56:47.970482 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a7b9ed-7b78-4954-bf72-ffc847065b24" containerName="container-00" Jan 30 06:56:47 crc kubenswrapper[4841]: I0130 06:56:47.971669 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:47 crc kubenswrapper[4841]: I0130 06:56:47.975468 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wkctm"/"default-dockercfg-s7wm6" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.069110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7ppk\" (UniqueName: \"kubernetes.io/projected/63851204-21ac-4882-bc86-1c35b28eb3ae-kube-api-access-p7ppk\") pod \"crc-debug-xhjns\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.069316 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63851204-21ac-4882-bc86-1c35b28eb3ae-host\") pod \"crc-debug-xhjns\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " 
pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.171138 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7ppk\" (UniqueName: \"kubernetes.io/projected/63851204-21ac-4882-bc86-1c35b28eb3ae-kube-api-access-p7ppk\") pod \"crc-debug-xhjns\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.171255 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63851204-21ac-4882-bc86-1c35b28eb3ae-host\") pod \"crc-debug-xhjns\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.171367 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63851204-21ac-4882-bc86-1c35b28eb3ae-host\") pod \"crc-debug-xhjns\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.205231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7ppk\" (UniqueName: \"kubernetes.io/projected/63851204-21ac-4882-bc86-1c35b28eb3ae-kube-api-access-p7ppk\") pod \"crc-debug-xhjns\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.294093 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:48 crc kubenswrapper[4841]: W0130 06:56:48.337587 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63851204_21ac_4882_bc86_1c35b28eb3ae.slice/crio-951f44af6cbe956c022971a2873f8f395e2b70a256c7f3c87effbbdae7e41c65 WatchSource:0}: Error finding container 951f44af6cbe956c022971a2873f8f395e2b70a256c7f3c87effbbdae7e41c65: Status 404 returned error can't find the container with id 951f44af6cbe956c022971a2873f8f395e2b70a256c7f3c87effbbdae7e41c65 Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.445044 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a7b9ed-7b78-4954-bf72-ffc847065b24" path="/var/lib/kubelet/pods/a7a7b9ed-7b78-4954-bf72-ffc847065b24/volumes" Jan 30 06:56:48 crc kubenswrapper[4841]: I0130 06:56:48.571756 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/crc-debug-xhjns" event={"ID":"63851204-21ac-4882-bc86-1c35b28eb3ae","Type":"ContainerStarted","Data":"951f44af6cbe956c022971a2873f8f395e2b70a256c7f3c87effbbdae7e41c65"} Jan 30 06:56:49 crc kubenswrapper[4841]: I0130 06:56:49.583254 4841 generic.go:334] "Generic (PLEG): container finished" podID="63851204-21ac-4882-bc86-1c35b28eb3ae" containerID="ff7e888a0ce7435981c017525b76ff992fb960e0ff0a4230bab903a404a039b6" exitCode=1 Jan 30 06:56:49 crc kubenswrapper[4841]: I0130 06:56:49.583372 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/crc-debug-xhjns" event={"ID":"63851204-21ac-4882-bc86-1c35b28eb3ae","Type":"ContainerDied","Data":"ff7e888a0ce7435981c017525b76ff992fb960e0ff0a4230bab903a404a039b6"} Jan 30 06:56:49 crc kubenswrapper[4841]: I0130 06:56:49.628671 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wkctm/crc-debug-xhjns"] Jan 30 06:56:49 crc kubenswrapper[4841]: I0130 06:56:49.642162 4841 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wkctm/crc-debug-xhjns"] Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.035593 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-7vfn6"] Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.047241 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-7vfn6"] Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.457033 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3b0080-e6fb-4ac6-8e06-b4096892d2b8" path="/var/lib/kubelet/pods/ad3b0080-e6fb-4ac6-8e06-b4096892d2b8/volumes" Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.749986 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.851066 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63851204-21ac-4882-bc86-1c35b28eb3ae-host\") pod \"63851204-21ac-4882-bc86-1c35b28eb3ae\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.851249 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7ppk\" (UniqueName: \"kubernetes.io/projected/63851204-21ac-4882-bc86-1c35b28eb3ae-kube-api-access-p7ppk\") pod \"63851204-21ac-4882-bc86-1c35b28eb3ae\" (UID: \"63851204-21ac-4882-bc86-1c35b28eb3ae\") " Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.852861 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63851204-21ac-4882-bc86-1c35b28eb3ae-host" (OuterVolumeSpecName: "host") pod "63851204-21ac-4882-bc86-1c35b28eb3ae" (UID: "63851204-21ac-4882-bc86-1c35b28eb3ae"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.870328 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63851204-21ac-4882-bc86-1c35b28eb3ae-kube-api-access-p7ppk" (OuterVolumeSpecName: "kube-api-access-p7ppk") pod "63851204-21ac-4882-bc86-1c35b28eb3ae" (UID: "63851204-21ac-4882-bc86-1c35b28eb3ae"). InnerVolumeSpecName "kube-api-access-p7ppk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.954831 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63851204-21ac-4882-bc86-1c35b28eb3ae-host\") on node \"crc\" DevicePath \"\"" Jan 30 06:56:50 crc kubenswrapper[4841]: I0130 06:56:50.954886 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7ppk\" (UniqueName: \"kubernetes.io/projected/63851204-21ac-4882-bc86-1c35b28eb3ae-kube-api-access-p7ppk\") on node \"crc\" DevicePath \"\"" Jan 30 06:56:51 crc kubenswrapper[4841]: I0130 06:56:51.037806 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-2df5-account-create-update-gqjjl"] Jan 30 06:56:51 crc kubenswrapper[4841]: I0130 06:56:51.051029 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-2df5-account-create-update-gqjjl"] Jan 30 06:56:51 crc kubenswrapper[4841]: I0130 06:56:51.605912 4841 scope.go:117] "RemoveContainer" containerID="ff7e888a0ce7435981c017525b76ff992fb960e0ff0a4230bab903a404a039b6" Jan 30 06:56:51 crc kubenswrapper[4841]: I0130 06:56:51.605972 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/crc-debug-xhjns" Jan 30 06:56:52 crc kubenswrapper[4841]: I0130 06:56:52.448161 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63851204-21ac-4882-bc86-1c35b28eb3ae" path="/var/lib/kubelet/pods/63851204-21ac-4882-bc86-1c35b28eb3ae/volumes" Jan 30 06:56:52 crc kubenswrapper[4841]: I0130 06:56:52.568216 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e24b2d2-fd41-4fb5-9433-e2e0d7b70266" path="/var/lib/kubelet/pods/9e24b2d2-fd41-4fb5-9433-e2e0d7b70266/volumes" Jan 30 06:56:56 crc kubenswrapper[4841]: I0130 06:56:56.597138 4841 scope.go:117] "RemoveContainer" containerID="9ee49d0d2d9dc00ea3264731c984def580b89af0651675dfb616fe9f5ff92aff" Jan 30 06:56:58 crc kubenswrapper[4841]: I0130 06:56:58.432487 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:56:58 crc kubenswrapper[4841]: E0130 06:56:58.433247 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:56:58 crc kubenswrapper[4841]: I0130 06:56:58.631121 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="fb572435-b17a-4caf-afb1-a2c334f6bc35" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.1.150:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 06:57:01 crc kubenswrapper[4841]: I0130 06:57:01.487613 4841 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 5.610093713s: 
[/var/lib/containers/storage/overlay/fee380da97675e226870d09a78019299c113afb1f82416a4d562f75747566844/diff /var/log/pods/openshift-must-gather-wkctm_must-gather-s8xvz_dc994cfa-341d-4e23-ac9c-72abe21d2b0c/gather/0.log]; will not log again for this container unless duration exceeds 2s Jan 30 06:57:01 crc kubenswrapper[4841]: I0130 06:57:01.499597 4841 scope.go:117] "RemoveContainer" containerID="c2912e987a0200d8fbbeb69885f2ad0a6171dd539ce5726ca73ec45187c4367c" Jan 30 06:57:01 crc kubenswrapper[4841]: I0130 06:57:01.558815 4841 scope.go:117] "RemoveContainer" containerID="faba54ee2d6ecf901ab1a1a6af0a6c3f047379488ad4e20367c55ec816112a6d" Jan 30 06:57:01 crc kubenswrapper[4841]: I0130 06:57:01.618488 4841 scope.go:117] "RemoveContainer" containerID="a93748098483f741ad17575cfeaaf729dc2486ab84f610abf18b5ada2b6e3e10" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.356197 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-49mbp"] Jan 30 06:57:09 crc kubenswrapper[4841]: E0130 06:57:09.360713 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63851204-21ac-4882-bc86-1c35b28eb3ae" containerName="container-00" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.360840 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="63851204-21ac-4882-bc86-1c35b28eb3ae" containerName="container-00" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.361219 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="63851204-21ac-4882-bc86-1c35b28eb3ae" containerName="container-00" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.363051 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.400786 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49mbp"] Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.490934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-catalog-content\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.491032 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4rp\" (UniqueName: \"kubernetes.io/projected/e11b5f26-2240-41c0-b4cb-0329f96c42be-kube-api-access-2n4rp\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.491064 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-utilities\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.593269 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-catalog-content\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.593371 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2n4rp\" (UniqueName: \"kubernetes.io/projected/e11b5f26-2240-41c0-b4cb-0329f96c42be-kube-api-access-2n4rp\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.593394 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-utilities\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.594006 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-catalog-content\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.594187 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-utilities\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.611965 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4rp\" (UniqueName: \"kubernetes.io/projected/e11b5f26-2240-41c0-b4cb-0329f96c42be-kube-api-access-2n4rp\") pod \"redhat-operators-49mbp\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:09 crc kubenswrapper[4841]: I0130 06:57:09.700253 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:10 crc kubenswrapper[4841]: I0130 06:57:10.265137 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49mbp"] Jan 30 06:57:10 crc kubenswrapper[4841]: W0130 06:57:10.269271 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11b5f26_2240_41c0_b4cb_0329f96c42be.slice/crio-700d19fd04b0bde721bff0793d7b45164b3b579795313c8594fadfb3bd952c49 WatchSource:0}: Error finding container 700d19fd04b0bde721bff0793d7b45164b3b579795313c8594fadfb3bd952c49: Status 404 returned error can't find the container with id 700d19fd04b0bde721bff0793d7b45164b3b579795313c8594fadfb3bd952c49 Jan 30 06:57:10 crc kubenswrapper[4841]: I0130 06:57:10.844669 4841 generic.go:334] "Generic (PLEG): container finished" podID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerID="3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42" exitCode=0 Jan 30 06:57:10 crc kubenswrapper[4841]: I0130 06:57:10.844773 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49mbp" event={"ID":"e11b5f26-2240-41c0-b4cb-0329f96c42be","Type":"ContainerDied","Data":"3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42"} Jan 30 06:57:10 crc kubenswrapper[4841]: I0130 06:57:10.844897 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49mbp" event={"ID":"e11b5f26-2240-41c0-b4cb-0329f96c42be","Type":"ContainerStarted","Data":"700d19fd04b0bde721bff0793d7b45164b3b579795313c8594fadfb3bd952c49"} Jan 30 06:57:11 crc kubenswrapper[4841]: I0130 06:57:11.432188 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:57:11 crc kubenswrapper[4841]: E0130 06:57:11.432764 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:57:11 crc kubenswrapper[4841]: I0130 06:57:11.862570 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49mbp" event={"ID":"e11b5f26-2240-41c0-b4cb-0329f96c42be","Type":"ContainerStarted","Data":"c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84"} Jan 30 06:57:18 crc kubenswrapper[4841]: I0130 06:57:18.924881 4841 generic.go:334] "Generic (PLEG): container finished" podID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerID="c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84" exitCode=0 Jan 30 06:57:18 crc kubenswrapper[4841]: I0130 06:57:18.925342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49mbp" event={"ID":"e11b5f26-2240-41c0-b4cb-0329f96c42be","Type":"ContainerDied","Data":"c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84"} Jan 30 06:57:19 crc kubenswrapper[4841]: I0130 06:57:19.940884 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49mbp" event={"ID":"e11b5f26-2240-41c0-b4cb-0329f96c42be","Type":"ContainerStarted","Data":"b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9"} Jan 30 06:57:19 crc kubenswrapper[4841]: I0130 06:57:19.962549 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-49mbp" podStartSLOduration=2.403949379 podStartE2EDuration="10.962533105s" podCreationTimestamp="2026-01-30 06:57:09 +0000 UTC" firstStartedPulling="2026-01-30 06:57:10.847158799 +0000 UTC m=+6567.840631437" lastFinishedPulling="2026-01-30 
06:57:19.405742525 +0000 UTC m=+6576.399215163" observedRunningTime="2026-01-30 06:57:19.959602136 +0000 UTC m=+6576.953074804" watchObservedRunningTime="2026-01-30 06:57:19.962533105 +0000 UTC m=+6576.956005743" Jan 30 06:57:26 crc kubenswrapper[4841]: I0130 06:57:26.432034 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:57:26 crc kubenswrapper[4841]: E0130 06:57:26.432829 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:57:29 crc kubenswrapper[4841]: I0130 06:57:29.701779 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:29 crc kubenswrapper[4841]: I0130 06:57:29.702021 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:30 crc kubenswrapper[4841]: I0130 06:57:30.771795 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49mbp" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="registry-server" probeResult="failure" output=< Jan 30 06:57:30 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Jan 30 06:57:30 crc kubenswrapper[4841]: > Jan 30 06:57:39 crc kubenswrapper[4841]: I0130 06:57:39.432943 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:57:39 crc kubenswrapper[4841]: E0130 06:57:39.433650 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:57:39 crc kubenswrapper[4841]: I0130 06:57:39.773740 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:39 crc kubenswrapper[4841]: I0130 06:57:39.823052 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:40 crc kubenswrapper[4841]: I0130 06:57:40.562200 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49mbp"] Jan 30 06:57:41 crc kubenswrapper[4841]: I0130 06:57:41.068298 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-fvvjj"] Jan 30 06:57:41 crc kubenswrapper[4841]: I0130 06:57:41.084903 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-fvvjj"] Jan 30 06:57:41 crc kubenswrapper[4841]: I0130 06:57:41.157204 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-49mbp" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="registry-server" containerID="cri-o://b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9" gracePeriod=2 Jan 30 06:57:41 crc kubenswrapper[4841]: I0130 06:57:41.968786 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.009060 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n4rp\" (UniqueName: \"kubernetes.io/projected/e11b5f26-2240-41c0-b4cb-0329f96c42be-kube-api-access-2n4rp\") pod \"e11b5f26-2240-41c0-b4cb-0329f96c42be\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.009185 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-utilities\") pod \"e11b5f26-2240-41c0-b4cb-0329f96c42be\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.009207 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-catalog-content\") pod \"e11b5f26-2240-41c0-b4cb-0329f96c42be\" (UID: \"e11b5f26-2240-41c0-b4cb-0329f96c42be\") " Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.010325 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-utilities" (OuterVolumeSpecName: "utilities") pod "e11b5f26-2240-41c0-b4cb-0329f96c42be" (UID: "e11b5f26-2240-41c0-b4cb-0329f96c42be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.019139 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e11b5f26-2240-41c0-b4cb-0329f96c42be-kube-api-access-2n4rp" (OuterVolumeSpecName: "kube-api-access-2n4rp") pod "e11b5f26-2240-41c0-b4cb-0329f96c42be" (UID: "e11b5f26-2240-41c0-b4cb-0329f96c42be"). InnerVolumeSpecName "kube-api-access-2n4rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.112408 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n4rp\" (UniqueName: \"kubernetes.io/projected/e11b5f26-2240-41c0-b4cb-0329f96c42be-kube-api-access-2n4rp\") on node \"crc\" DevicePath \"\"" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.112467 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.169582 4841 generic.go:334] "Generic (PLEG): container finished" podID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerID="b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9" exitCode=0 Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.169634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49mbp" event={"ID":"e11b5f26-2240-41c0-b4cb-0329f96c42be","Type":"ContainerDied","Data":"b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9"} Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.169673 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49mbp" event={"ID":"e11b5f26-2240-41c0-b4cb-0329f96c42be","Type":"ContainerDied","Data":"700d19fd04b0bde721bff0793d7b45164b3b579795313c8594fadfb3bd952c49"} Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.169696 4841 scope.go:117] "RemoveContainer" containerID="b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.169731 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49mbp" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.192182 4841 scope.go:117] "RemoveContainer" containerID="c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.215704 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e11b5f26-2240-41c0-b4cb-0329f96c42be" (UID: "e11b5f26-2240-41c0-b4cb-0329f96c42be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.215862 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11b5f26-2240-41c0-b4cb-0329f96c42be-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.228621 4841 scope.go:117] "RemoveContainer" containerID="3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.288134 4841 scope.go:117] "RemoveContainer" containerID="b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9" Jan 30 06:57:42 crc kubenswrapper[4841]: E0130 06:57:42.288745 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9\": container with ID starting with b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9 not found: ID does not exist" containerID="b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.288797 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9"} err="failed to get container status \"b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9\": rpc error: code = NotFound desc = could not find container \"b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9\": container with ID starting with b9313fdbe58cc252d1c51081633b43f08d796e4a73389c99088914275a1748a9 not found: ID does not exist" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.288824 4841 scope.go:117] "RemoveContainer" containerID="c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84" Jan 30 06:57:42 crc kubenswrapper[4841]: E0130 06:57:42.289180 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84\": container with ID starting with c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84 not found: ID does not exist" containerID="c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.289215 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84"} err="failed to get container status \"c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84\": rpc error: code = NotFound desc = could not find container \"c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84\": container with ID starting with c66ebf0384a6906526e5a0ac37b7d6ebc77582f588798f20fb7e098aa8731e84 not found: ID does not exist" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.289237 4841 scope.go:117] "RemoveContainer" containerID="3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42" Jan 30 06:57:42 crc kubenswrapper[4841]: E0130 06:57:42.289673 4841 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42\": container with ID starting with 3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42 not found: ID does not exist" containerID="3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.289856 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42"} err="failed to get container status \"3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42\": rpc error: code = NotFound desc = could not find container \"3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42\": container with ID starting with 3c69c0b524758f6fe664e8c7117fb4f4d734cccf96546ebb74e288021d15cb42 not found: ID does not exist" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.455956 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621420dc-8438-4a7d-ad06-37e25becb572" path="/var/lib/kubelet/pods/621420dc-8438-4a7d-ad06-37e25becb572/volumes" Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.510267 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49mbp"] Jan 30 06:57:42 crc kubenswrapper[4841]: I0130 06:57:42.520394 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-49mbp"] Jan 30 06:57:44 crc kubenswrapper[4841]: I0130 06:57:44.476195 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" path="/var/lib/kubelet/pods/e11b5f26-2240-41c0-b4cb-0329f96c42be/volumes" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.221837 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_bbb5c437-a583-437d-9d16-34a0f8ed492e/init-config-reloader/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.439961 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_bbb5c437-a583-437d-9d16-34a0f8ed492e/init-config-reloader/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.458283 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_bbb5c437-a583-437d-9d16-34a0f8ed492e/alertmanager/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.485254 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_bbb5c437-a583-437d-9d16-34a0f8ed492e/config-reloader/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.645097 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ec786f59-6593-4344-988a-14e9236879df/aodh-api/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.696652 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ec786f59-6593-4344-988a-14e9236879df/aodh-evaluator/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.820892 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ec786f59-6593-4344-988a-14e9236879df/aodh-listener/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.859351 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ec786f59-6593-4344-988a-14e9236879df/aodh-notifier/0.log" Jan 30 06:57:45 crc kubenswrapper[4841]: I0130 06:57:45.903844 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-1557-account-create-update-jbrq7_d248d34a-8ccf-48dd-bb30-9ad79bd380c8/mariadb-account-create-update/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.044547 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-db-create-96zrp_4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44/mariadb-database-create/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.115481 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-sync-npgfr_6f73848a-d9bc-4486-b7f5-f9f3fca5e13c/aodh-db-sync/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.270025 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7579c69d68-vszjk_521faf93-2a50-42d0-851e-3f2b407aeb5f/barbican-api/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.299000 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7579c69d68-vszjk_521faf93-2a50-42d0-851e-3f2b407aeb5f/barbican-api-log/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.451520 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7bdf49ff4b-r29n8_5daf3957-8483-40b4-a236-f92459dab9e4/barbican-keystone-listener/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.537555 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7bdf49ff4b-r29n8_5daf3957-8483-40b4-a236-f92459dab9e4/barbican-keystone-listener-log/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.554840 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-775cc4458f-ltjlx_a1b2b5e9-9162-48d4-b839-18e4d5316535/barbican-worker/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.759183 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-775cc4458f-ltjlx_a1b2b5e9-9162-48d4-b839-18e4d5316535/barbican-worker-log/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.785591 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3735cd33-a80d-4d70-a60e-28f51e415a4e/ceilometer-central-agent/0.log" Jan 30 06:57:46 crc 
kubenswrapper[4841]: I0130 06:57:46.977259 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3735cd33-a80d-4d70-a60e-28f51e415a4e/sg-core/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.979904 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3735cd33-a80d-4d70-a60e-28f51e415a4e/proxy-httpd/0.log" Jan 30 06:57:46 crc kubenswrapper[4841]: I0130 06:57:46.998927 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3735cd33-a80d-4d70-a60e-28f51e415a4e/ceilometer-notification-agent/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.184461 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_881d50e8-f2e2-4d39-b27d-b893a75b9470/cinder-api/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.191374 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_881d50e8-f2e2-4d39-b27d-b893a75b9470/cinder-api-log/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.363000 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9e49f703-0cd0-4d76-aa9a-7694ac86e74b/cinder-scheduler/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.406154 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9e49f703-0cd0-4d76-aa9a-7694ac86e74b/probe/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.495730 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6446589fcf-vl8h4_1aef25db-fefb-45b2-8db8-14dd86910ddf/init/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.681518 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6446589fcf-vl8h4_1aef25db-fefb-45b2-8db8-14dd86910ddf/init/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.686119 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6446589fcf-vl8h4_1aef25db-fefb-45b2-8db8-14dd86910ddf/dnsmasq-dns/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.778989 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_02d60ed7-0a8d-45c7-8448-89a43c660178/glance-httpd/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.898613 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_02d60ed7-0a8d-45c7-8448-89a43c660178/glance-log/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.952114 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_75a77158-2a09-4554-b6c1-c04b11fff9cf/glance-httpd/0.log" Jan 30 06:57:47 crc kubenswrapper[4841]: I0130 06:57:47.995983 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_75a77158-2a09-4554-b6c1-c04b11fff9cf/glance-log/0.log" Jan 30 06:57:48 crc kubenswrapper[4841]: I0130 06:57:48.145665 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-30c7-account-create-update-tzfxj_96f84dc1-99f0-4b50-ae30-d15c6166c0b9/mariadb-account-create-update/0.log" Jan 30 06:57:48 crc kubenswrapper[4841]: I0130 06:57:48.250554 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-766f7d9fb9-kf8rg_23b34f45-6116-42be-b368-f81b715edee4/heat-api/0.log" Jan 30 06:57:48 crc kubenswrapper[4841]: I0130 06:57:48.446337 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-67dcbb9cbb-qvgxj_baef3ad1-d37a-4782-af64-1e87771092cd/heat-cfnapi/0.log" Jan 30 06:57:48 crc kubenswrapper[4841]: I0130 06:57:48.582436 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-create-sgbz8_90c33414-5508-46f4-bce1-b008af425e4c/mariadb-database-create/0.log" Jan 30 06:57:48 crc kubenswrapper[4841]: I0130 06:57:48.690750 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_heat-db-sync-8sv85_a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3/heat-db-sync/0.log" Jan 30 06:57:48 crc kubenswrapper[4841]: I0130 06:57:48.817129 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-69b8748bf7-r7xx5_484c8903-064d-425e-b01b-2d61dbe306da/heat-engine/0.log" Jan 30 06:57:48 crc kubenswrapper[4841]: I0130 06:57:48.991973 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67fcb744b8-lwsgh_0fcb13ee-9577-4300-909e-735238669ee3/horizon/0.log" Jan 30 06:57:49 crc kubenswrapper[4841]: I0130 06:57:49.264745 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67fcb744b8-lwsgh_0fcb13ee-9577-4300-909e-735238669ee3/horizon-log/0.log" Jan 30 06:57:49 crc kubenswrapper[4841]: I0130 06:57:49.354131 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5525028c-d229-479d-a828-a5d33c61d333/kube-state-metrics/0.log" Jan 30 06:57:49 crc kubenswrapper[4841]: I0130 06:57:49.453759 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7b7f59b7f9-hz6dc_e2f58a90-22a1-478c-a19e-6e71499f307e/keystone-api/0.log" Jan 30 06:57:49 crc kubenswrapper[4841]: I0130 06:57:49.507222 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_e8971d3d-723a-46c8-b729-707fbe7b953b/adoption/0.log" Jan 30 06:57:49 crc kubenswrapper[4841]: I0130 06:57:49.748363 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-589f6d954f-4jxxb_d0490b4c-ee5f-4676-80f4-0d239b99a182/neutron-api/0.log" Jan 30 06:57:49 crc kubenswrapper[4841]: I0130 06:57:49.844451 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-589f6d954f-4jxxb_d0490b4c-ee5f-4676-80f4-0d239b99a182/neutron-httpd/0.log" Jan 30 06:57:50 crc kubenswrapper[4841]: I0130 06:57:50.114293 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_fec0a9da-3385-4362-b5bf-c44ff1de727a/nova-api-log/0.log" Jan 30 06:57:50 crc kubenswrapper[4841]: I0130 06:57:50.213717 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fec0a9da-3385-4362-b5bf-c44ff1de727a/nova-api-api/0.log" Jan 30 06:57:50 crc kubenswrapper[4841]: I0130 06:57:50.245666 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_67e2985a-b102-472e-90af-13a1e4278197/nova-cell0-conductor-conductor/0.log" Jan 30 06:57:50 crc kubenswrapper[4841]: I0130 06:57:50.361137 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_85adb897-5ab0-44b4-95a1-36e1610522d8/nova-cell1-conductor-conductor/0.log" Jan 30 06:57:50 crc kubenswrapper[4841]: I0130 06:57:50.534911 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b7eb1e2a-e1f2-4bb7-a95d-fbe741da44ae/nova-cell1-novncproxy-novncproxy/0.log" Jan 30 06:57:50 crc kubenswrapper[4841]: I0130 06:57:50.712828 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_12ae12c4-8995-4bcd-9237-393737f5ae1d/nova-metadata-log/0.log" Jan 30 06:57:50 crc kubenswrapper[4841]: I0130 06:57:50.999521 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-86b4d7c875-xgqnc_01c16743-1d26-4b7a-805a-1b7452a2dd0e/init/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.037193 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e97b62be-59aa-4331-bc11-98b4fd2297d8/nova-scheduler-scheduler/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.107940 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_12ae12c4-8995-4bcd-9237-393737f5ae1d/nova-metadata-metadata/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.196856 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-86b4d7c875-xgqnc_01c16743-1d26-4b7a-805a-1b7452a2dd0e/init/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.395455 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-86b4d7c875-xgqnc_01c16743-1d26-4b7a-805a-1b7452a2dd0e/octavia-api-provider-agent/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.420572 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-64w76_e20a4f5d-18f6-4eeb-84ac-6d1ee628795e/init/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.445826 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-86b4d7c875-xgqnc_01c16743-1d26-4b7a-805a-1b7452a2dd0e/octavia-api/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.636617 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-64w76_e20a4f5d-18f6-4eeb-84ac-6d1ee628795e/octavia-healthmanager/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.669381 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-d9nvn_8414debc-743c-4450-9a4b-d3cc68df42c7/init/0.log" Jan 30 06:57:51 crc kubenswrapper[4841]: I0130 06:57:51.694064 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-64w76_e20a4f5d-18f6-4eeb-84ac-6d1ee628795e/init/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.008114 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-d9nvn_8414debc-743c-4450-9a4b-d3cc68df42c7/init/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.018363 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-d9nvn_8414debc-743c-4450-9a4b-d3cc68df42c7/octavia-housekeeping/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.087430 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-image-upload-65dd99cb46-xxkcb_c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb/init/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.271943 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-65dd99cb46-xxkcb_c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb/init/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.285846 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-65dd99cb46-xxkcb_c0ac0073-7e30-4b75-9b7a-3694c2cd3cbb/octavia-amphora-httpd/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.386630 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c92xw_1f804e83-62a5-4880-b595-7ece506cb780/init/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.431601 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:57:52 crc kubenswrapper[4841]: E0130 06:57:52.431938 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.608249 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c92xw_1f804e83-62a5-4880-b595-7ece506cb780/octavia-rsyslog/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.662396 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c92xw_1f804e83-62a5-4880-b595-7ece506cb780/init/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.683680 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-55s5s_794000ea-5a8e-44dd-b44c-af445a21f8ec/init/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.886452 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b274ae67-2cd8-40ad-a998-0483646d84a1/mysql-bootstrap/0.log" Jan 30 06:57:52 crc kubenswrapper[4841]: I0130 06:57:52.898411 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-55s5s_794000ea-5a8e-44dd-b44c-af445a21f8ec/init/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.124893 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-55s5s_794000ea-5a8e-44dd-b44c-af445a21f8ec/octavia-worker/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.272903 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b274ae67-2cd8-40ad-a998-0483646d84a1/mysql-bootstrap/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.294445 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b274ae67-2cd8-40ad-a998-0483646d84a1/galera/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.391486 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f5fdf7cb-89ff-42f2-b405-f0423ab54224/mysql-bootstrap/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.559660 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f5fdf7cb-89ff-42f2-b405-f0423ab54224/mysql-bootstrap/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.616657 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_207e1540-ff6c-44b7-8d66-a6a4572fcbb2/openstackclient/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.655412 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_f5fdf7cb-89ff-42f2-b405-f0423ab54224/galera/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.861867 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4wrmv_496ddc15-cb66-4871-85b5-39673abb40e6/ovn-controller/0.log" Jan 30 06:57:53 crc kubenswrapper[4841]: I0130 06:57:53.961169 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mm4vw_6caea7ee-bd59-4d38-89d0-ca3bbeda764e/openstack-network-exporter/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.073757 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2596f_7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd/ovsdb-server-init/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.285637 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2596f_7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd/ovsdb-server/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.349082 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2596f_7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd/ovsdb-server-init/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.380021 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2596f_7b90ffe0-123f-41d2-a8b2-3f9e1cb55ccd/ovs-vswitchd/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.489536 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_5fe595aa-ebed-48fc-a546-37e04be7808f/adoption/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.604773 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cea99eaf-d2ae-4baf-a45c-27a7f5279d5c/ovn-northd/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.643102 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_cea99eaf-d2ae-4baf-a45c-27a7f5279d5c/openstack-network-exporter/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.769358 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1317cff7-7bfb-4a1a-8686-d3bcb83d6949/openstack-network-exporter/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.815027 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1317cff7-7bfb-4a1a-8686-d3bcb83d6949/ovsdbserver-nb/0.log" Jan 30 06:57:54 crc kubenswrapper[4841]: I0130 06:57:54.959180 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_8975a1d6-f9a2-4161-9d45-3e2886345aec/openstack-network-exporter/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.074983 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_8975a1d6-f9a2-4161-9d45-3e2886345aec/ovsdbserver-nb/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.126586 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_dcc67785-4c94-4fda-a487-9a6d82288895/ovsdbserver-nb/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.162726 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_dcc67785-4c94-4fda-a487-9a6d82288895/openstack-network-exporter/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.288122 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a274354c-c005-44ce-85f0-feb8762cc66d/openstack-network-exporter/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.413082 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a274354c-c005-44ce-85f0-feb8762cc66d/ovsdbserver-sb/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.483968 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_861c7989-9589-44f4-bfac-39fc9c3c5b8c/openstack-network-exporter/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.549908 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_861c7989-9589-44f4-bfac-39fc9c3c5b8c/ovsdbserver-sb/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.677660 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_990cb14a-6eab-4cbf-997f-d04cb95b3575/openstack-network-exporter/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.752181 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_990cb14a-6eab-4cbf-997f-d04cb95b3575/ovsdbserver-sb/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.916569 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55cd7599f4-9hklc_6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f/placement-api/0.log" Jan 30 06:57:55 crc kubenswrapper[4841]: I0130 06:57:55.945803 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55cd7599f4-9hklc_6f4aa5c0-cef6-4fbb-9194-544ada4c5f8f/placement-log/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.062376 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb572435-b17a-4caf-afb1-a2c334f6bc35/init-config-reloader/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.218700 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb572435-b17a-4caf-afb1-a2c334f6bc35/config-reloader/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.230567 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb572435-b17a-4caf-afb1-a2c334f6bc35/init-config-reloader/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.251288 4841 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb572435-b17a-4caf-afb1-a2c334f6bc35/prometheus/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.338634 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb572435-b17a-4caf-afb1-a2c334f6bc35/thanos-sidecar/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.473986 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a6b64b15-a098-4533-98d1-c9d8ac355ada/setup-container/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.666238 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a6b64b15-a098-4533-98d1-c9d8ac355ada/setup-container/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.674223 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a6b64b15-a098-4533-98d1-c9d8ac355ada/rabbitmq/0.log" Jan 30 06:57:56 crc kubenswrapper[4841]: I0130 06:57:56.825040 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a957beed-affa-4b58-9ac2-f3fe95d3a50c/setup-container/0.log" Jan 30 06:57:57 crc kubenswrapper[4841]: I0130 06:57:57.017110 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a957beed-affa-4b58-9ac2-f3fe95d3a50c/setup-container/0.log" Jan 30 06:57:57 crc kubenswrapper[4841]: I0130 06:57:57.067014 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a957beed-affa-4b58-9ac2-f3fe95d3a50c/rabbitmq/0.log" Jan 30 06:57:57 crc kubenswrapper[4841]: I0130 06:57:57.083711 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-666dbb5c86-lvn9k_6eab423d-c14d-4368-8fa2-7cf5147b9410/proxy-httpd/0.log" Jan 30 06:57:57 crc kubenswrapper[4841]: I0130 06:57:57.192610 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-666dbb5c86-lvn9k_6eab423d-c14d-4368-8fa2-7cf5147b9410/proxy-server/0.log" Jan 30 06:57:57 crc kubenswrapper[4841]: I0130 06:57:57.338633 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mt4km_e345d041-d2b4-4aa7-aae0-35c5dbba9d0b/swift-ring-rebalance/0.log" Jan 30 06:58:01 crc kubenswrapper[4841]: I0130 06:58:01.861421 4841 scope.go:117] "RemoveContainer" containerID="a58793f75867646f56782014bc7dfb75ad6b7818ff19eaa2ee886e3c411105dd" Jan 30 06:58:01 crc kubenswrapper[4841]: I0130 06:58:01.914843 4841 scope.go:117] "RemoveContainer" containerID="7c753b15d9f36d778db1c8d786c8c9dfc38edce3c553775a33eb5718aa479253" Jan 30 06:58:07 crc kubenswrapper[4841]: I0130 06:58:07.431888 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:58:07 crc kubenswrapper[4841]: E0130 06:58:07.432535 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:58:12 crc kubenswrapper[4841]: I0130 06:58:12.341785 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_bcdeb8de-47da-4a31-b19f-e178726e3676/memcached/0.log" Jan 30 06:58:21 crc kubenswrapper[4841]: I0130 06:58:21.432915 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:58:21 crc kubenswrapper[4841]: E0130 06:58:21.434577 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:58:26 crc kubenswrapper[4841]: I0130 06:58:26.660599 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88_d34858df-a7b1-4f65-9406-1b86d5a84e7d/util/0.log" Jan 30 06:58:26 crc kubenswrapper[4841]: I0130 06:58:26.854965 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88_d34858df-a7b1-4f65-9406-1b86d5a84e7d/pull/0.log" Jan 30 06:58:26 crc kubenswrapper[4841]: I0130 06:58:26.858014 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88_d34858df-a7b1-4f65-9406-1b86d5a84e7d/util/0.log" Jan 30 06:58:26 crc kubenswrapper[4841]: I0130 06:58:26.861935 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88_d34858df-a7b1-4f65-9406-1b86d5a84e7d/pull/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.087221 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88_d34858df-a7b1-4f65-9406-1b86d5a84e7d/util/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.092263 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88_d34858df-a7b1-4f65-9406-1b86d5a84e7d/extract/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.101888 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1znp88_d34858df-a7b1-4f65-9406-1b86d5a84e7d/pull/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.362011 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-c857j_683d1de5-a849-4cc4-ad31-e4ddce58ce3a/manager/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.372129 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-ggldr_5abd246f-9cd6-44a5-b189-3f757aa6904b/manager/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.479373 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-b9wjq_da35fca1-95bf-4bf9-9dc6-2696846c402d/manager/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.680119 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-5bf7x_382d26e6-1e34-4de0-8e4a-50230ce1a90f/manager/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.775225 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-69jsz_36279e2b-d43d-452c-a159-269eccab814a/manager/0.log" Jan 30 06:58:27 crc kubenswrapper[4841]: I0130 06:58:27.783125 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-qf8h2_99db8f34-be75-405c-abfc-c79f8a246b3a/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.006577 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-74h94_e0cb95b9-d9f5-4927-b7e5-47199da17894/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.371764 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-b64xq_d3675976-12f6-4169-8354-b3fbca99354a/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.398456 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-8622s_53bed426-a8df-4f33-8c52-c838d1a47f35/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.503186 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-bz4zb_a3671fee-5baf-4bcf-8246-49b65ef8f0c8/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.574444 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-6mfjj_0ddb6fe3-6a2c-493f-b718-a5766e3a7cf8/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.692829 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-ptsr8_9dfc9a4b-0426-4293-b78d-e63c74b0ec96/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.875968 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-cwtpg_a9d964d8-8035-47e5-9ed9-c7713882002c/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.941087 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-s227p_22c16a0e-8121-40ce-82a1-6129d8f4b017/manager/0.log" Jan 30 06:58:28 crc kubenswrapper[4841]: I0130 06:58:28.980156 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86dfb79cc7lnh4f_13851207-9bc9-41c9-b6d6-3dab03a5e62c/manager/0.log" Jan 30 06:58:29 crc kubenswrapper[4841]: I0130 
06:58:29.266772 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-757f46c65d-45fmd_fff273ef-511c-42c9-aeaa-b9f1b67483e8/operator/0.log" Jan 30 06:58:29 crc kubenswrapper[4841]: I0130 06:58:29.451491 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lzlm9_50258f4b-004d-44a8-bdde-f2e607ed183d/registry-server/0.log" Jan 30 06:58:29 crc kubenswrapper[4841]: I0130 06:58:29.744989 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-fhf8t_f8753b24-bc7c-4623-904c-a7c7c0dd7aec/manager/0.log" Jan 30 06:58:29 crc kubenswrapper[4841]: I0130 06:58:29.849993 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-fjqg4_3b8874f3-5993-4156-be01-f7952851fb6f/manager/0.log" Jan 30 06:58:30 crc kubenswrapper[4841]: I0130 06:58:30.100124 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wbqf2_9585c98b-78c2-4860-a00f-4347390f4432/operator/0.log" Jan 30 06:58:30 crc kubenswrapper[4841]: I0130 06:58:30.246973 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-56qdv_72033411-0acc-4925-85c1-9fe48cb2157d/manager/0.log" Jan 30 06:58:30 crc kubenswrapper[4841]: I0130 06:58:30.505536 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-zfhk4_4c201612-a22c-44ad-8f91-c5e4f45e895f/manager/0.log" Jan 30 06:58:30 crc kubenswrapper[4841]: I0130 06:58:30.600108 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-vcq29_14b05ee0-2b05-4406-8721-979476d7c5be/manager/0.log" Jan 30 06:58:30 crc kubenswrapper[4841]: I0130 
06:58:30.805827 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-hlthw_cad53450-3002-49dc-bde3-c32d90ec2272/manager/0.log" Jan 30 06:58:31 crc kubenswrapper[4841]: I0130 06:58:31.212313 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b6f655c79-mbf2k_1e0a24a0-c7c6-4c83-94f9-918314ee3ac7/manager/0.log" Jan 30 06:58:32 crc kubenswrapper[4841]: I0130 06:58:32.433248 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:58:32 crc kubenswrapper[4841]: E0130 06:58:32.434051 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:58:43 crc kubenswrapper[4841]: I0130 06:58:43.431947 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:58:43 crc kubenswrapper[4841]: E0130 06:58:43.432697 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:58:50 crc kubenswrapper[4841]: I0130 06:58:50.475318 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-s42sv_5f2ade6a-33a5-4643-84ee-b2bd43c55446/control-plane-machine-set-operator/0.log" Jan 30 06:58:50 crc kubenswrapper[4841]: I0130 06:58:50.564189 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qrk5s_29892f01-d39f-41cd-aa3c-402791553b2c/kube-rbac-proxy/0.log" Jan 30 06:58:50 crc kubenswrapper[4841]: I0130 06:58:50.657213 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qrk5s_29892f01-d39f-41cd-aa3c-402791553b2c/machine-api-operator/0.log" Jan 30 06:58:58 crc kubenswrapper[4841]: I0130 06:58:58.431931 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:58:58 crc kubenswrapper[4841]: E0130 06:58:58.432960 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:59:04 crc kubenswrapper[4841]: I0130 06:59:04.756533 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-rnc5n_e233256a-c807-4442-bb33-2506fd8d34bd/cert-manager-controller/0.log" Jan 30 06:59:04 crc kubenswrapper[4841]: I0130 06:59:04.860497 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-pfb94_af42727b-5803-4bf8-a60b-992706c93f4b/cert-manager-cainjector/0.log" Jan 30 06:59:04 crc kubenswrapper[4841]: I0130 06:59:04.939836 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-b8rnj_1b8b844d-eefd-433f-9f59-3bedb92b8731/cert-manager-webhook/0.log" Jan 30 06:59:11 crc kubenswrapper[4841]: I0130 06:59:11.433067 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:59:11 crc kubenswrapper[4841]: E0130 06:59:11.434499 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:59:20 crc kubenswrapper[4841]: I0130 06:59:20.402540 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-mnmq2_e802f3fc-3992-4900-b7db-8fc0938a3433/nmstate-console-plugin/0.log" Jan 30 06:59:20 crc kubenswrapper[4841]: I0130 06:59:20.623653 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-766rn_1eb9465f-7705-4888-ab88-9da6eb8d9f5b/nmstate-handler/0.log" Jan 30 06:59:20 crc kubenswrapper[4841]: I0130 06:59:20.667375 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-bmmlb_b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5/kube-rbac-proxy/0.log" Jan 30 06:59:20 crc kubenswrapper[4841]: I0130 06:59:20.756160 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-bmmlb_b3cf04d4-9d2c-47cd-9bac-6a4e13850ff5/nmstate-metrics/0.log" Jan 30 06:59:20 crc kubenswrapper[4841]: I0130 06:59:20.836173 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-gkcjz_9e2f7f43-61e4-481f-8015-170a5af14054/nmstate-operator/0.log" Jan 30 06:59:20 crc kubenswrapper[4841]: I0130 06:59:20.940714 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-7jj7v_c0d1633d-39be-4f82-a1b8-472d8c578b0c/nmstate-webhook/0.log" Jan 30 06:59:26 crc kubenswrapper[4841]: I0130 06:59:26.433301 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:59:26 crc kubenswrapper[4841]: E0130 06:59:26.434636 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:59:36 crc kubenswrapper[4841]: I0130 06:59:36.888345 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-t9p5b_226c058c-6fb2-493d-ac46-d42aeec0a369/prometheus-operator/0.log" Jan 30 06:59:37 crc kubenswrapper[4841]: I0130 06:59:37.108025 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5_04deced0-d0da-4612-a8d3-7c03ec537275/prometheus-operator-admission-webhook/0.log" Jan 30 06:59:37 crc kubenswrapper[4841]: I0130 06:59:37.117458 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns_3039e990-e132-43ec-bef0-22d0c3c66705/prometheus-operator-admission-webhook/0.log" Jan 30 06:59:37 crc kubenswrapper[4841]: I0130 06:59:37.339338 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ql57f_971ec121-d790-4ee9-b43a-6e924e45fd27/operator/0.log" Jan 30 06:59:37 crc kubenswrapper[4841]: I0130 06:59:37.369949 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8ljvk_9d5142af-ee4e-4290-bb79-e7ee3e20fca3/perses-operator/0.log" Jan 30 06:59:40 crc kubenswrapper[4841]: I0130 06:59:40.432971 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:59:40 crc kubenswrapper[4841]: E0130 06:59:40.433805 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hd8v2_openshift-machine-config-operator(a24700eb-27ff-4126-9f6a-40ee9575e5ef)\"" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" Jan 30 06:59:53 crc kubenswrapper[4841]: I0130 06:59:53.094130 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-lq5x8_4d1b8121-f308-408b-9a30-c80bf53ce798/kube-rbac-proxy/0.log" Jan 30 06:59:53 crc kubenswrapper[4841]: I0130 06:59:53.525837 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-frr-files/0.log" Jan 30 06:59:53 crc kubenswrapper[4841]: I0130 06:59:53.580257 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-lq5x8_4d1b8121-f308-408b-9a30-c80bf53ce798/controller/0.log" Jan 30 06:59:53 crc kubenswrapper[4841]: I0130 06:59:53.758640 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-metrics/0.log" Jan 30 06:59:53 crc kubenswrapper[4841]: I0130 06:59:53.827423 4841 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-frr-files/0.log" Jan 30 06:59:53 crc kubenswrapper[4841]: I0130 06:59:53.829677 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-reloader/0.log" Jan 30 06:59:53 crc kubenswrapper[4841]: I0130 06:59:53.855559 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-reloader/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.012496 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-frr-files/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.018953 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-metrics/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.019090 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-metrics/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.046813 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-reloader/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.199221 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-frr-files/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.208299 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-reloader/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.232692 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/cp-metrics/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.254763 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/controller/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.417570 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/frr-metrics/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.465013 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/kube-rbac-proxy-frr/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.523965 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/kube-rbac-proxy/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.607257 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/reloader/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.745240 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-h8rs4_07615a0a-8c5e-4800-894f-96d6d83fdc93/frr-k8s-webhook-server/0.log" Jan 30 06:59:54 crc kubenswrapper[4841]: I0130 06:59:54.918961 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6b59845b8f-g7kss_581356ab-8116-4326-b032-e5cc2b5ce488/manager/0.log" Jan 30 06:59:55 crc kubenswrapper[4841]: I0130 06:59:55.095737 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c6fcc658d-pprtw_de2ec55e-51fe-48f2-87f5-06fb1ceed00a/webhook-server/0.log" Jan 30 06:59:55 crc kubenswrapper[4841]: I0130 06:59:55.272527 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2pl4z_2f6fd33c-3515-4d5b-ac7a-582a89f3c82c/kube-rbac-proxy/0.log" Jan 30 06:59:55 crc kubenswrapper[4841]: I0130 06:59:55.431805 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 06:59:56 crc kubenswrapper[4841]: I0130 06:59:56.150009 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2pl4z_2f6fd33c-3515-4d5b-ac7a-582a89f3c82c/speaker/0.log" Jan 30 06:59:56 crc kubenswrapper[4841]: I0130 06:59:56.524373 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"f996c8819c194d15de61cb4cf0dae2be58222d7e2572d190b2fcc044b3e875c1"} Jan 30 06:59:57 crc kubenswrapper[4841]: I0130 06:59:57.305321 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sq72v_d80501d3-b73c-4a52-a12d-81e0115bc785/frr/0.log" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.319141 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw"] Jan 30 07:00:00 crc kubenswrapper[4841]: E0130 07:00:00.320065 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="extract-utilities" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.320078 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="extract-utilities" Jan 30 07:00:00 crc kubenswrapper[4841]: E0130 07:00:00.320101 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.320107 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4841]: E0130 07:00:00.320146 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="extract-content" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.320153 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="extract-content" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.320379 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e11b5f26-2240-41c0-b4cb-0329f96c42be" containerName="registry-server" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.321233 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.326114 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.326509 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.334696 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw"] Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.421816 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51d11379-2bec-4555-8de8-48cede56188c-config-volume\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.421947 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51d11379-2bec-4555-8de8-48cede56188c-secret-volume\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.422132 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmlf4\" (UniqueName: \"kubernetes.io/projected/51d11379-2bec-4555-8de8-48cede56188c-kube-api-access-qmlf4\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.523785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51d11379-2bec-4555-8de8-48cede56188c-secret-volume\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.523980 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmlf4\" (UniqueName: \"kubernetes.io/projected/51d11379-2bec-4555-8de8-48cede56188c-kube-api-access-qmlf4\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.524069 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51d11379-2bec-4555-8de8-48cede56188c-config-volume\") pod \"collect-profiles-29495940-9qhbw\" (UID: 
\"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.525823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51d11379-2bec-4555-8de8-48cede56188c-config-volume\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.544864 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmlf4\" (UniqueName: \"kubernetes.io/projected/51d11379-2bec-4555-8de8-48cede56188c-kube-api-access-qmlf4\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.554486 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51d11379-2bec-4555-8de8-48cede56188c-secret-volume\") pod \"collect-profiles-29495940-9qhbw\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:00 crc kubenswrapper[4841]: I0130 07:00:00.655542 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:01 crc kubenswrapper[4841]: I0130 07:00:01.228317 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw"] Jan 30 07:00:01 crc kubenswrapper[4841]: I0130 07:00:01.575611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" event={"ID":"51d11379-2bec-4555-8de8-48cede56188c","Type":"ContainerStarted","Data":"aaef3b8b9bcfd3f1861e273394ce38b7b0dd6f0ea5de47f071f084ea5472f3b3"} Jan 30 07:00:01 crc kubenswrapper[4841]: I0130 07:00:01.575959 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" event={"ID":"51d11379-2bec-4555-8de8-48cede56188c","Type":"ContainerStarted","Data":"e29281ce8e583a8f99a1dd8d30e81c2d3f75f1ec5b570f82bae773efce90c5b4"} Jan 30 07:00:01 crc kubenswrapper[4841]: I0130 07:00:01.594163 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" podStartSLOduration=1.59414086 podStartE2EDuration="1.59414086s" podCreationTimestamp="2026-01-30 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 07:00:01.5923024 +0000 UTC m=+6738.585775038" watchObservedRunningTime="2026-01-30 07:00:01.59414086 +0000 UTC m=+6738.587613518" Jan 30 07:00:02 crc kubenswrapper[4841]: I0130 07:00:02.587460 4841 generic.go:334] "Generic (PLEG): container finished" podID="51d11379-2bec-4555-8de8-48cede56188c" containerID="aaef3b8b9bcfd3f1861e273394ce38b7b0dd6f0ea5de47f071f084ea5472f3b3" exitCode=0 Jan 30 07:00:02 crc kubenswrapper[4841]: I0130 07:00:02.587557 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" event={"ID":"51d11379-2bec-4555-8de8-48cede56188c","Type":"ContainerDied","Data":"aaef3b8b9bcfd3f1861e273394ce38b7b0dd6f0ea5de47f071f084ea5472f3b3"} Jan 30 07:00:03 crc kubenswrapper[4841]: I0130 07:00:03.990280 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.101553 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51d11379-2bec-4555-8de8-48cede56188c-secret-volume\") pod \"51d11379-2bec-4555-8de8-48cede56188c\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.101986 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51d11379-2bec-4555-8de8-48cede56188c-config-volume\") pod \"51d11379-2bec-4555-8de8-48cede56188c\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.102135 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmlf4\" (UniqueName: \"kubernetes.io/projected/51d11379-2bec-4555-8de8-48cede56188c-kube-api-access-qmlf4\") pod \"51d11379-2bec-4555-8de8-48cede56188c\" (UID: \"51d11379-2bec-4555-8de8-48cede56188c\") " Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.103823 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d11379-2bec-4555-8de8-48cede56188c-config-volume" (OuterVolumeSpecName: "config-volume") pod "51d11379-2bec-4555-8de8-48cede56188c" (UID: "51d11379-2bec-4555-8de8-48cede56188c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.112031 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d11379-2bec-4555-8de8-48cede56188c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "51d11379-2bec-4555-8de8-48cede56188c" (UID: "51d11379-2bec-4555-8de8-48cede56188c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.112176 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d11379-2bec-4555-8de8-48cede56188c-kube-api-access-qmlf4" (OuterVolumeSpecName: "kube-api-access-qmlf4") pod "51d11379-2bec-4555-8de8-48cede56188c" (UID: "51d11379-2bec-4555-8de8-48cede56188c"). InnerVolumeSpecName "kube-api-access-qmlf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.205266 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51d11379-2bec-4555-8de8-48cede56188c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.205302 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51d11379-2bec-4555-8de8-48cede56188c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.205317 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmlf4\" (UniqueName: \"kubernetes.io/projected/51d11379-2bec-4555-8de8-48cede56188c-kube-api-access-qmlf4\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.608816 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" 
event={"ID":"51d11379-2bec-4555-8de8-48cede56188c","Type":"ContainerDied","Data":"e29281ce8e583a8f99a1dd8d30e81c2d3f75f1ec5b570f82bae773efce90c5b4"} Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.608854 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e29281ce8e583a8f99a1dd8d30e81c2d3f75f1ec5b570f82bae773efce90c5b4" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.608969 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495940-9qhbw" Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.694256 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"] Jan 30 07:00:04 crc kubenswrapper[4841]: I0130 07:00:04.707051 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495895-brpss"] Jan 30 07:00:06 crc kubenswrapper[4841]: I0130 07:00:06.444498 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aee0eb8-1461-42b5-b53c-b020d519ee43" path="/var/lib/kubelet/pods/3aee0eb8-1461-42b5-b53c-b020d519ee43/volumes" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.027207 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr_bef20871-4c82-4eeb-83c9-0f47f86b41e1/util/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.243488 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr_bef20871-4c82-4eeb-83c9-0f47f86b41e1/util/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.256053 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr_bef20871-4c82-4eeb-83c9-0f47f86b41e1/pull/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.376048 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr_bef20871-4c82-4eeb-83c9-0f47f86b41e1/pull/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.492976 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr_bef20871-4c82-4eeb-83c9-0f47f86b41e1/pull/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.496727 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr_bef20871-4c82-4eeb-83c9-0f47f86b41e1/util/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.559828 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc822fr_bef20871-4c82-4eeb-83c9-0f47f86b41e1/extract/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.685314 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn_57724098-c506-430f-977a-11a306b6044c/util/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.851495 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn_57724098-c506-430f-977a-11a306b6044c/pull/0.log" Jan 30 07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.854155 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn_57724098-c506-430f-977a-11a306b6044c/pull/0.log" Jan 30 
07:00:11 crc kubenswrapper[4841]: I0130 07:00:11.882603 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn_57724098-c506-430f-977a-11a306b6044c/util/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.250310 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn_57724098-c506-430f-977a-11a306b6044c/pull/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.293072 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn_57724098-c506-430f-977a-11a306b6044c/util/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.296902 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s66xn_57724098-c506-430f-977a-11a306b6044c/extract/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.437642 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv_b37e9576-6d8a-408d-8bbd-ce2df13aecf1/util/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.696927 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv_b37e9576-6d8a-408d-8bbd-ce2df13aecf1/pull/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.726776 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv_b37e9576-6d8a-408d-8bbd-ce2df13aecf1/pull/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.732539 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv_b37e9576-6d8a-408d-8bbd-ce2df13aecf1/util/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.901144 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv_b37e9576-6d8a-408d-8bbd-ce2df13aecf1/extract/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.911021 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv_b37e9576-6d8a-408d-8bbd-ce2df13aecf1/util/0.log" Jan 30 07:00:12 crc kubenswrapper[4841]: I0130 07:00:12.963683 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jpdmv_b37e9576-6d8a-408d-8bbd-ce2df13aecf1/pull/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.095545 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9_ac910689-bd23-493a-9b20-89e21df5d758/util/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.281436 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9_ac910689-bd23-493a-9b20-89e21df5d758/pull/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.310526 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9_ac910689-bd23-493a-9b20-89e21df5d758/util/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.317997 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9_ac910689-bd23-493a-9b20-89e21df5d758/pull/0.log" Jan 30 
07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.478543 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rk7zd"] Jan 30 07:00:13 crc kubenswrapper[4841]: E0130 07:00:13.479994 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d11379-2bec-4555-8de8-48cede56188c" containerName="collect-profiles" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.480014 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d11379-2bec-4555-8de8-48cede56188c" containerName="collect-profiles" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.480221 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d11379-2bec-4555-8de8-48cede56188c" containerName="collect-profiles" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.481771 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.506736 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk7zd"] Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.521813 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9_ac910689-bd23-493a-9b20-89e21df5d758/util/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.558786 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9_ac910689-bd23-493a-9b20-89e21df5d758/pull/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.562688 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08czxp9_ac910689-bd23-493a-9b20-89e21df5d758/extract/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.606599 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-catalog-content\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.606683 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-utilities\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.606781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8p5\" (UniqueName: \"kubernetes.io/projected/5f9d136c-ea59-455b-9430-6bd19690d125-kube-api-access-xl8p5\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.710050 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-catalog-content\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.710135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-utilities\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 
07:00:13.710237 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8p5\" (UniqueName: \"kubernetes.io/projected/5f9d136c-ea59-455b-9430-6bd19690d125-kube-api-access-xl8p5\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.710703 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-catalog-content\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.710763 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-utilities\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.741552 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lb7sq_3fd4a2f0-3409-4512-9591-fd515639c1ea/extract-utilities/0.log" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.771165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8p5\" (UniqueName: \"kubernetes.io/projected/5f9d136c-ea59-455b-9430-6bd19690d125-kube-api-access-xl8p5\") pod \"redhat-marketplace-rk7zd\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:13 crc kubenswrapper[4841]: I0130 07:00:13.807468 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.018966 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lb7sq_3fd4a2f0-3409-4512-9591-fd515639c1ea/extract-utilities/0.log" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.128535 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lb7sq_3fd4a2f0-3409-4512-9591-fd515639c1ea/extract-content/0.log" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.132378 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lb7sq_3fd4a2f0-3409-4512-9591-fd515639c1ea/extract-content/0.log" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.315087 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk7zd"] Jan 30 07:00:14 crc kubenswrapper[4841]: W0130 07:00:14.316435 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9d136c_ea59_455b_9430_6bd19690d125.slice/crio-06e6e1defc6370bbfac4ba52e5d206138963d5f877d99db9f35de87b54bb4a44 WatchSource:0}: Error finding container 06e6e1defc6370bbfac4ba52e5d206138963d5f877d99db9f35de87b54bb4a44: Status 404 returned error can't find the container with id 06e6e1defc6370bbfac4ba52e5d206138963d5f877d99db9f35de87b54bb4a44 Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.408993 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lb7sq_3fd4a2f0-3409-4512-9591-fd515639c1ea/extract-content/0.log" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.446220 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lb7sq_3fd4a2f0-3409-4512-9591-fd515639c1ea/extract-utilities/0.log" Jan 30 07:00:14 crc 
kubenswrapper[4841]: I0130 07:00:14.645961 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-znwlp_8bbac7be-b680-4d67-ace5-892c478656fa/extract-utilities/0.log" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.709000 4841 generic.go:334] "Generic (PLEG): container finished" podID="5f9d136c-ea59-455b-9430-6bd19690d125" containerID="71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51" exitCode=0 Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.709367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk7zd" event={"ID":"5f9d136c-ea59-455b-9430-6bd19690d125","Type":"ContainerDied","Data":"71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51"} Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.709415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk7zd" event={"ID":"5f9d136c-ea59-455b-9430-6bd19690d125","Type":"ContainerStarted","Data":"06e6e1defc6370bbfac4ba52e5d206138963d5f877d99db9f35de87b54bb4a44"} Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.712164 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.868553 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-znwlp_8bbac7be-b680-4d67-ace5-892c478656fa/extract-content/0.log" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.895748 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-znwlp_8bbac7be-b680-4d67-ace5-892c478656fa/extract-content/0.log" Jan 30 07:00:14 crc kubenswrapper[4841]: I0130 07:00:14.925185 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-znwlp_8bbac7be-b680-4d67-ace5-892c478656fa/extract-utilities/0.log" Jan 30 
07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.176660 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-znwlp_8bbac7be-b680-4d67-ace5-892c478656fa/extract-utilities/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.197942 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-znwlp_8bbac7be-b680-4d67-ace5-892c478656fa/extract-content/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.271659 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lb7sq_3fd4a2f0-3409-4512-9591-fd515639c1ea/registry-server/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.474253 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-slkqt_05338f03-2aa7-4afc-80f6-44a70f084420/marketplace-operator/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.525596 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8rcwn_d383667a-b7ea-42ea-b91f-4ac4306ddbaf/extract-utilities/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.687710 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-znwlp_8bbac7be-b680-4d67-ace5-892c478656fa/registry-server/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.718229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk7zd" event={"ID":"5f9d136c-ea59-455b-9430-6bd19690d125","Type":"ContainerStarted","Data":"e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3"} Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.726994 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8rcwn_d383667a-b7ea-42ea-b91f-4ac4306ddbaf/extract-content/0.log" Jan 30 07:00:15 crc 
kubenswrapper[4841]: I0130 07:00:15.727442 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8rcwn_d383667a-b7ea-42ea-b91f-4ac4306ddbaf/extract-content/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.743242 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8rcwn_d383667a-b7ea-42ea-b91f-4ac4306ddbaf/extract-utilities/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.932914 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8rcwn_d383667a-b7ea-42ea-b91f-4ac4306ddbaf/extract-content/0.log" Jan 30 07:00:15 crc kubenswrapper[4841]: I0130 07:00:15.946723 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8rcwn_d383667a-b7ea-42ea-b91f-4ac4306ddbaf/extract-utilities/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.016581 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s766p_e34e1d3e-f2d8-466b-b80e-158f344ac558/extract-utilities/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.208896 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8rcwn_d383667a-b7ea-42ea-b91f-4ac4306ddbaf/registry-server/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.231586 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s766p_e34e1d3e-f2d8-466b-b80e-158f344ac558/extract-content/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.264790 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s766p_e34e1d3e-f2d8-466b-b80e-158f344ac558/extract-utilities/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.265467 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-s766p_e34e1d3e-f2d8-466b-b80e-158f344ac558/extract-content/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.439639 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s766p_e34e1d3e-f2d8-466b-b80e-158f344ac558/extract-utilities/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.453523 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s766p_e34e1d3e-f2d8-466b-b80e-158f344ac558/extract-content/0.log" Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.727484 4841 generic.go:334] "Generic (PLEG): container finished" podID="5f9d136c-ea59-455b-9430-6bd19690d125" containerID="e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3" exitCode=0 Jan 30 07:00:16 crc kubenswrapper[4841]: I0130 07:00:16.727535 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk7zd" event={"ID":"5f9d136c-ea59-455b-9430-6bd19690d125","Type":"ContainerDied","Data":"e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3"} Jan 30 07:00:17 crc kubenswrapper[4841]: I0130 07:00:17.123867 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s766p_e34e1d3e-f2d8-466b-b80e-158f344ac558/registry-server/0.log" Jan 30 07:00:17 crc kubenswrapper[4841]: I0130 07:00:17.739660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk7zd" event={"ID":"5f9d136c-ea59-455b-9430-6bd19690d125","Type":"ContainerStarted","Data":"a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a"} Jan 30 07:00:17 crc kubenswrapper[4841]: I0130 07:00:17.791349 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rk7zd" podStartSLOduration=2.365221441 podStartE2EDuration="4.791326925s" 
podCreationTimestamp="2026-01-30 07:00:13 +0000 UTC" firstStartedPulling="2026-01-30 07:00:14.711953814 +0000 UTC m=+6751.705426442" lastFinishedPulling="2026-01-30 07:00:17.138059288 +0000 UTC m=+6754.131531926" observedRunningTime="2026-01-30 07:00:17.781867039 +0000 UTC m=+6754.775339677" watchObservedRunningTime="2026-01-30 07:00:17.791326925 +0000 UTC m=+6754.784799563" Jan 30 07:00:23 crc kubenswrapper[4841]: I0130 07:00:23.808184 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:23 crc kubenswrapper[4841]: I0130 07:00:23.808812 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:23 crc kubenswrapper[4841]: I0130 07:00:23.877742 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:24 crc kubenswrapper[4841]: I0130 07:00:24.934836 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:25 crc kubenswrapper[4841]: I0130 07:00:25.633190 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk7zd"] Jan 30 07:00:26 crc kubenswrapper[4841]: I0130 07:00:26.876501 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rk7zd" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="registry-server" containerID="cri-o://a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a" gracePeriod=2 Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.410857 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.550253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl8p5\" (UniqueName: \"kubernetes.io/projected/5f9d136c-ea59-455b-9430-6bd19690d125-kube-api-access-xl8p5\") pod \"5f9d136c-ea59-455b-9430-6bd19690d125\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.550359 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-utilities\") pod \"5f9d136c-ea59-455b-9430-6bd19690d125\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.550388 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-catalog-content\") pod \"5f9d136c-ea59-455b-9430-6bd19690d125\" (UID: \"5f9d136c-ea59-455b-9430-6bd19690d125\") " Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.551916 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-utilities" (OuterVolumeSpecName: "utilities") pod "5f9d136c-ea59-455b-9430-6bd19690d125" (UID: "5f9d136c-ea59-455b-9430-6bd19690d125"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.555527 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9d136c-ea59-455b-9430-6bd19690d125-kube-api-access-xl8p5" (OuterVolumeSpecName: "kube-api-access-xl8p5") pod "5f9d136c-ea59-455b-9430-6bd19690d125" (UID: "5f9d136c-ea59-455b-9430-6bd19690d125"). InnerVolumeSpecName "kube-api-access-xl8p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.575837 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f9d136c-ea59-455b-9430-6bd19690d125" (UID: "5f9d136c-ea59-455b-9430-6bd19690d125"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.652945 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.652982 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f9d136c-ea59-455b-9430-6bd19690d125-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.652993 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl8p5\" (UniqueName: \"kubernetes.io/projected/5f9d136c-ea59-455b-9430-6bd19690d125-kube-api-access-xl8p5\") on node \"crc\" DevicePath \"\"" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.896946 4841 generic.go:334] "Generic (PLEG): container finished" podID="5f9d136c-ea59-455b-9430-6bd19690d125" containerID="a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a" exitCode=0 Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.896995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk7zd" event={"ID":"5f9d136c-ea59-455b-9430-6bd19690d125","Type":"ContainerDied","Data":"a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a"} Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.897033 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk7zd" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.897134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk7zd" event={"ID":"5f9d136c-ea59-455b-9430-6bd19690d125","Type":"ContainerDied","Data":"06e6e1defc6370bbfac4ba52e5d206138963d5f877d99db9f35de87b54bb4a44"} Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.897174 4841 scope.go:117] "RemoveContainer" containerID="a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.932736 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk7zd"] Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.935312 4841 scope.go:117] "RemoveContainer" containerID="e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3" Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.945867 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk7zd"] Jan 30 07:00:27 crc kubenswrapper[4841]: I0130 07:00:27.960843 4841 scope.go:117] "RemoveContainer" containerID="71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51" Jan 30 07:00:28 crc kubenswrapper[4841]: I0130 07:00:28.020680 4841 scope.go:117] "RemoveContainer" containerID="a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a" Jan 30 07:00:28 crc kubenswrapper[4841]: E0130 07:00:28.021196 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a\": container with ID starting with a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a not found: ID does not exist" containerID="a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a" Jan 30 07:00:28 crc kubenswrapper[4841]: I0130 07:00:28.021240 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a"} err="failed to get container status \"a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a\": rpc error: code = NotFound desc = could not find container \"a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a\": container with ID starting with a1645b98f19be80554f5dbd4bcf7852886d83376ec1a1456af85d49e5c85735a not found: ID does not exist" Jan 30 07:00:28 crc kubenswrapper[4841]: I0130 07:00:28.021267 4841 scope.go:117] "RemoveContainer" containerID="e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3" Jan 30 07:00:28 crc kubenswrapper[4841]: E0130 07:00:28.021952 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3\": container with ID starting with e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3 not found: ID does not exist" containerID="e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3" Jan 30 07:00:28 crc kubenswrapper[4841]: I0130 07:00:28.021976 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3"} err="failed to get container status \"e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3\": rpc error: code = NotFound desc = could not find container \"e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3\": container with ID starting with e514d8a958c36a29751abcba711c9eccd084dec0aee88113395c8664a48851f3 not found: ID does not exist" Jan 30 07:00:28 crc kubenswrapper[4841]: I0130 07:00:28.021989 4841 scope.go:117] "RemoveContainer" containerID="71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51" Jan 30 07:00:28 crc kubenswrapper[4841]: E0130 
07:00:28.022445 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51\": container with ID starting with 71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51 not found: ID does not exist" containerID="71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51" Jan 30 07:00:28 crc kubenswrapper[4841]: I0130 07:00:28.022476 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51"} err="failed to get container status \"71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51\": rpc error: code = NotFound desc = could not find container \"71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51\": container with ID starting with 71a75aafff58599f03620b65e4a3a37c01054ec3fcdd3522d930169efbc5ec51 not found: ID does not exist" Jan 30 07:00:28 crc kubenswrapper[4841]: I0130 07:00:28.447388 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" path="/var/lib/kubelet/pods/5f9d136c-ea59-455b-9430-6bd19690d125/volumes" Jan 30 07:00:31 crc kubenswrapper[4841]: I0130 07:00:31.227168 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6f78bb5fff-2fch5_04deced0-d0da-4612-a8d3-7c03ec537275/prometheus-operator-admission-webhook/0.log" Jan 30 07:00:31 crc kubenswrapper[4841]: I0130 07:00:31.231686 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-t9p5b_226c058c-6fb2-493d-ac46-d42aeec0a369/prometheus-operator/0.log" Jan 30 07:00:31 crc kubenswrapper[4841]: I0130 07:00:31.309316 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6f78bb5fff-l48ns_3039e990-e132-43ec-bef0-22d0c3c66705/prometheus-operator-admission-webhook/0.log" Jan 30 07:00:31 crc kubenswrapper[4841]: I0130 07:00:31.437303 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8ljvk_9d5142af-ee4e-4290-bb79-e7ee3e20fca3/perses-operator/0.log" Jan 30 07:00:31 crc kubenswrapper[4841]: I0130 07:00:31.445667 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ql57f_971ec121-d790-4ee9-b43a-6e924e45fd27/operator/0.log" Jan 30 07:00:42 crc kubenswrapper[4841]: E0130 07:00:42.812496 4841 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.36:41384->38.102.83.36:43553: write tcp 38.102.83.36:41384->38.102.83.36:43553: write: broken pipe Jan 30 07:00:55 crc kubenswrapper[4841]: E0130 07:00:55.978542 4841 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.36:34510->38.102.83.36:43553: write tcp 38.102.83.36:34510->38.102.83.36:43553: write: connection reset by peer Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.163154 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29495941-s5m54"] Jan 30 07:01:00 crc kubenswrapper[4841]: E0130 07:01:00.164189 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="registry-server" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.164204 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="registry-server" Jan 30 07:01:00 crc kubenswrapper[4841]: E0130 07:01:00.164226 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="extract-utilities" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 
07:01:00.164235 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="extract-utilities" Jan 30 07:01:00 crc kubenswrapper[4841]: E0130 07:01:00.164254 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="extract-content" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.164291 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="extract-content" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.164563 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9d136c-ea59-455b-9430-6bd19690d125" containerName="registry-server" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.165455 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.180251 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29495941-s5m54"] Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.299714 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rth\" (UniqueName: \"kubernetes.io/projected/357e6c7b-f67e-4305-90a3-3c65621471fc-kube-api-access-p5rth\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.299785 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-config-data\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.299818 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-fernet-keys\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.300215 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-combined-ca-bundle\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.402067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-combined-ca-bundle\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.402195 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rth\" (UniqueName: \"kubernetes.io/projected/357e6c7b-f67e-4305-90a3-3c65621471fc-kube-api-access-p5rth\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.402242 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-config-data\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.402273 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-fernet-keys\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.408811 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-fernet-keys\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.408900 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-config-data\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.409528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-combined-ca-bundle\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.452231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5rth\" (UniqueName: \"kubernetes.io/projected/357e6c7b-f67e-4305-90a3-3c65621471fc-kube-api-access-p5rth\") pod \"keystone-cron-29495941-s5m54\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.492835 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:00 crc kubenswrapper[4841]: I0130 07:01:00.979306 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29495941-s5m54"] Jan 30 07:01:01 crc kubenswrapper[4841]: I0130 07:01:01.205486 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-s5m54" event={"ID":"357e6c7b-f67e-4305-90a3-3c65621471fc","Type":"ContainerStarted","Data":"611624149ed3b9f6c48fcd901132a7a01c4e4fd3e8e872de450aedc3d37e4b55"} Jan 30 07:01:01 crc kubenswrapper[4841]: I0130 07:01:01.205804 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-s5m54" event={"ID":"357e6c7b-f67e-4305-90a3-3c65621471fc","Type":"ContainerStarted","Data":"fedc700b9051521b67ee26aae018dc2403c292d566b28851fc045a697bb2885a"} Jan 30 07:01:01 crc kubenswrapper[4841]: I0130 07:01:01.230388 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29495941-s5m54" podStartSLOduration=1.230370816 podStartE2EDuration="1.230370816s" podCreationTimestamp="2026-01-30 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 07:01:01.217552209 +0000 UTC m=+6798.211024847" watchObservedRunningTime="2026-01-30 07:01:01.230370816 +0000 UTC m=+6798.223843454" Jan 30 07:01:02 crc kubenswrapper[4841]: I0130 07:01:02.076122 4841 scope.go:117] "RemoveContainer" containerID="15203c44eb2199d094440f479d6223702bccf3b5cd6b611e91dc45bbc260a8ff" Jan 30 07:01:04 crc kubenswrapper[4841]: I0130 07:01:04.056898 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-sgbz8"] Jan 30 07:01:04 crc kubenswrapper[4841]: I0130 07:01:04.066873 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-30c7-account-create-update-tzfxj"] Jan 30 07:01:04 crc kubenswrapper[4841]: I0130 
07:01:04.078178 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-sgbz8"] Jan 30 07:01:04 crc kubenswrapper[4841]: I0130 07:01:04.099847 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-30c7-account-create-update-tzfxj"] Jan 30 07:01:04 crc kubenswrapper[4841]: I0130 07:01:04.457383 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c33414-5508-46f4-bce1-b008af425e4c" path="/var/lib/kubelet/pods/90c33414-5508-46f4-bce1-b008af425e4c/volumes" Jan 30 07:01:04 crc kubenswrapper[4841]: I0130 07:01:04.458373 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f84dc1-99f0-4b50-ae30-d15c6166c0b9" path="/var/lib/kubelet/pods/96f84dc1-99f0-4b50-ae30-d15c6166c0b9/volumes" Jan 30 07:01:05 crc kubenswrapper[4841]: I0130 07:01:05.249272 4841 generic.go:334] "Generic (PLEG): container finished" podID="357e6c7b-f67e-4305-90a3-3c65621471fc" containerID="611624149ed3b9f6c48fcd901132a7a01c4e4fd3e8e872de450aedc3d37e4b55" exitCode=0 Jan 30 07:01:05 crc kubenswrapper[4841]: I0130 07:01:05.249430 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-s5m54" event={"ID":"357e6c7b-f67e-4305-90a3-3c65621471fc","Type":"ContainerDied","Data":"611624149ed3b9f6c48fcd901132a7a01c4e4fd3e8e872de450aedc3d37e4b55"} Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.723753 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.882882 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5rth\" (UniqueName: \"kubernetes.io/projected/357e6c7b-f67e-4305-90a3-3c65621471fc-kube-api-access-p5rth\") pod \"357e6c7b-f67e-4305-90a3-3c65621471fc\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.883004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-combined-ca-bundle\") pod \"357e6c7b-f67e-4305-90a3-3c65621471fc\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.883096 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-fernet-keys\") pod \"357e6c7b-f67e-4305-90a3-3c65621471fc\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.883169 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-config-data\") pod \"357e6c7b-f67e-4305-90a3-3c65621471fc\" (UID: \"357e6c7b-f67e-4305-90a3-3c65621471fc\") " Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.888575 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "357e6c7b-f67e-4305-90a3-3c65621471fc" (UID: "357e6c7b-f67e-4305-90a3-3c65621471fc"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.889753 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357e6c7b-f67e-4305-90a3-3c65621471fc-kube-api-access-p5rth" (OuterVolumeSpecName: "kube-api-access-p5rth") pod "357e6c7b-f67e-4305-90a3-3c65621471fc" (UID: "357e6c7b-f67e-4305-90a3-3c65621471fc"). InnerVolumeSpecName "kube-api-access-p5rth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.920630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "357e6c7b-f67e-4305-90a3-3c65621471fc" (UID: "357e6c7b-f67e-4305-90a3-3c65621471fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.942887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-config-data" (OuterVolumeSpecName: "config-data") pod "357e6c7b-f67e-4305-90a3-3c65621471fc" (UID: "357e6c7b-f67e-4305-90a3-3c65621471fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.986099 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.986148 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.986170 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5rth\" (UniqueName: \"kubernetes.io/projected/357e6c7b-f67e-4305-90a3-3c65621471fc-kube-api-access-p5rth\") on node \"crc\" DevicePath \"\"" Jan 30 07:01:06 crc kubenswrapper[4841]: I0130 07:01:06.986190 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/357e6c7b-f67e-4305-90a3-3c65621471fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 07:01:07 crc kubenswrapper[4841]: I0130 07:01:07.283392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29495941-s5m54" event={"ID":"357e6c7b-f67e-4305-90a3-3c65621471fc","Type":"ContainerDied","Data":"fedc700b9051521b67ee26aae018dc2403c292d566b28851fc045a697bb2885a"} Jan 30 07:01:07 crc kubenswrapper[4841]: I0130 07:01:07.284185 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fedc700b9051521b67ee26aae018dc2403c292d566b28851fc045a697bb2885a" Jan 30 07:01:07 crc kubenswrapper[4841]: I0130 07:01:07.284256 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29495941-s5m54" Jan 30 07:01:19 crc kubenswrapper[4841]: I0130 07:01:19.061796 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-8sv85"] Jan 30 07:01:19 crc kubenswrapper[4841]: I0130 07:01:19.079876 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-8sv85"] Jan 30 07:01:20 crc kubenswrapper[4841]: I0130 07:01:20.455191 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3" path="/var/lib/kubelet/pods/a1ba88a0-5ce0-46a6-b0ca-8fb36d636ad3/volumes" Jan 30 07:02:02 crc kubenswrapper[4841]: I0130 07:02:02.190959 4841 scope.go:117] "RemoveContainer" containerID="3c04dafe32da32c2968f6b97bb1aa5d805cece9d89402407e9a0a0b7fc40840c" Jan 30 07:02:02 crc kubenswrapper[4841]: I0130 07:02:02.231898 4841 scope.go:117] "RemoveContainer" containerID="a20b84a78e155854c194b854cb0f8b3a36b090e47ec4ea63e5b03a2a9c57d365" Jan 30 07:02:02 crc kubenswrapper[4841]: I0130 07:02:02.344083 4841 scope.go:117] "RemoveContainer" containerID="20043f1975fcdde8b6a39c9ce9b879d7e01f63fe9829b3f8a38492ae13b5b75b" Jan 30 07:02:10 crc kubenswrapper[4841]: I0130 07:02:10.463213 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 07:02:10 crc kubenswrapper[4841]: I0130 07:02:10.464060 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 07:02:11 crc kubenswrapper[4841]: I0130 07:02:11.118168 4841 generic.go:334] 
"Generic (PLEG): container finished" podID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerID="97ca3af24a8d8ec56f02bf5e25aa20877120e5a51f57579c1a03a25e8c18da03" exitCode=0 Jan 30 07:02:11 crc kubenswrapper[4841]: I0130 07:02:11.118281 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkctm/must-gather-s8xvz" event={"ID":"dc994cfa-341d-4e23-ac9c-72abe21d2b0c","Type":"ContainerDied","Data":"97ca3af24a8d8ec56f02bf5e25aa20877120e5a51f57579c1a03a25e8c18da03"} Jan 30 07:02:11 crc kubenswrapper[4841]: I0130 07:02:11.120074 4841 scope.go:117] "RemoveContainer" containerID="97ca3af24a8d8ec56f02bf5e25aa20877120e5a51f57579c1a03a25e8c18da03" Jan 30 07:02:12 crc kubenswrapper[4841]: I0130 07:02:12.014714 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wkctm_must-gather-s8xvz_dc994cfa-341d-4e23-ac9c-72abe21d2b0c/gather/0.log" Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.873150 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szg4g"] Jan 30 07:02:16 crc kubenswrapper[4841]: E0130 07:02:16.874469 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e6c7b-f67e-4305-90a3-3c65621471fc" containerName="keystone-cron" Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.874491 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e6c7b-f67e-4305-90a3-3c65621471fc" containerName="keystone-cron" Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.874848 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e6c7b-f67e-4305-90a3-3c65621471fc" containerName="keystone-cron" Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.877532 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.922004 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szg4g"] Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.958891 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-utilities\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.959244 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-catalog-content\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:16 crc kubenswrapper[4841]: I0130 07:02:16.959370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxzsv\" (UniqueName: \"kubernetes.io/projected/cbfe01cc-793a-4060-abff-bf3815481773-kube-api-access-mxzsv\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.061469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-catalog-content\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.061927 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mxzsv\" (UniqueName: \"kubernetes.io/projected/cbfe01cc-793a-4060-abff-bf3815481773-kube-api-access-mxzsv\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.062182 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-utilities\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.062929 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-utilities\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.063129 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-catalog-content\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.089098 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxzsv\" (UniqueName: \"kubernetes.io/projected/cbfe01cc-793a-4060-abff-bf3815481773-kube-api-access-mxzsv\") pod \"community-operators-szg4g\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.222806 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:17 crc kubenswrapper[4841]: I0130 07:02:17.765041 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szg4g"] Jan 30 07:02:18 crc kubenswrapper[4841]: I0130 07:02:18.193674 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szg4g" event={"ID":"cbfe01cc-793a-4060-abff-bf3815481773","Type":"ContainerStarted","Data":"18b639d51c3493614dd56d08beb690271fd5b2fcf880ed05f1f72d0c89d2fa41"} Jan 30 07:02:22 crc kubenswrapper[4841]: I0130 07:02:22.534925 4841 patch_prober.go:28] interesting pod/controller-manager-6dcd857c4d-kfjhz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 07:02:22 crc kubenswrapper[4841]: I0130 07:02:22.535498 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dcd857c4d-kfjhz" podUID="e0e8ae97-9364-4396-a9f5-e7bc2acc2147" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 07:02:27 crc kubenswrapper[4841]: I0130 07:02:27.766606 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="3735cd33-a80d-4d70-a60e-28f51e415a4e" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 30 07:02:27 crc kubenswrapper[4841]: I0130 07:02:27.767437 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="3735cd33-a80d-4d70-a60e-28f51e415a4e" containerName="ceilometer-notification-agent" probeResult="failure" output="command 
timed out" Jan 30 07:02:29 crc kubenswrapper[4841]: I0130 07:02:29.910644 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56qdv" podUID="72033411-0acc-4925-85c1-9fe48cb2157d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 07:02:31 crc kubenswrapper[4841]: I0130 07:02:31.276624 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-79955696d6-bz4zb" podUID="a3671fee-5baf-4bcf-8246-49b65ef8f0c8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.79:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 07:02:31 crc kubenswrapper[4841]: I0130 07:02:31.446826 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-545d4d4674-rnc5n" podUID="e233256a-c807-4442-bb33-2506fd8d34bd" containerName="cert-manager-controller" probeResult="failure" output="Get \"http://10.217.0.67:9403/livez\": EOF" Jan 30 07:02:32 crc kubenswrapper[4841]: I0130 07:02:32.337128 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbfe01cc-793a-4060-abff-bf3815481773" containerID="d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302" exitCode=0 Jan 30 07:02:32 crc kubenswrapper[4841]: I0130 07:02:32.337734 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szg4g" event={"ID":"cbfe01cc-793a-4060-abff-bf3815481773","Type":"ContainerDied","Data":"d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302"} Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.143308 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wkctm/must-gather-s8xvz"] Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.143986 4841 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wkctm/must-gather-s8xvz" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerName="copy" containerID="cri-o://60bb217417395f0454bc85373d471f2d330b025d7b07592d268eb8ee2914ed7b" gracePeriod=2 Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.153812 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wkctm/must-gather-s8xvz"] Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.369287 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wkctm_must-gather-s8xvz_dc994cfa-341d-4e23-ac9c-72abe21d2b0c/copy/0.log" Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.373454 4841 generic.go:334] "Generic (PLEG): container finished" podID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerID="60bb217417395f0454bc85373d471f2d330b025d7b07592d268eb8ee2914ed7b" exitCode=143 Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.380226 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szg4g" event={"ID":"cbfe01cc-793a-4060-abff-bf3815481773","Type":"ContainerStarted","Data":"546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a"} Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.675048 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wkctm_must-gather-s8xvz_dc994cfa-341d-4e23-ac9c-72abe21d2b0c/copy/0.log" Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.675872 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.783535 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-must-gather-output\") pod \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.783582 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8fqk\" (UniqueName: \"kubernetes.io/projected/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-kube-api-access-l8fqk\") pod \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\" (UID: \"dc994cfa-341d-4e23-ac9c-72abe21d2b0c\") " Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.833642 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-kube-api-access-l8fqk" (OuterVolumeSpecName: "kube-api-access-l8fqk") pod "dc994cfa-341d-4e23-ac9c-72abe21d2b0c" (UID: "dc994cfa-341d-4e23-ac9c-72abe21d2b0c"). InnerVolumeSpecName "kube-api-access-l8fqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 07:02:33 crc kubenswrapper[4841]: I0130 07:02:33.888418 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8fqk\" (UniqueName: \"kubernetes.io/projected/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-kube-api-access-l8fqk\") on node \"crc\" DevicePath \"\"" Jan 30 07:02:34 crc kubenswrapper[4841]: I0130 07:02:34.003835 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dc994cfa-341d-4e23-ac9c-72abe21d2b0c" (UID: "dc994cfa-341d-4e23-ac9c-72abe21d2b0c"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 07:02:34 crc kubenswrapper[4841]: I0130 07:02:34.102491 4841 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc994cfa-341d-4e23-ac9c-72abe21d2b0c-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 30 07:02:34 crc kubenswrapper[4841]: I0130 07:02:34.399109 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wkctm_must-gather-s8xvz_dc994cfa-341d-4e23-ac9c-72abe21d2b0c/copy/0.log" Jan 30 07:02:34 crc kubenswrapper[4841]: I0130 07:02:34.399908 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkctm/must-gather-s8xvz" Jan 30 07:02:34 crc kubenswrapper[4841]: I0130 07:02:34.400089 4841 scope.go:117] "RemoveContainer" containerID="60bb217417395f0454bc85373d471f2d330b025d7b07592d268eb8ee2914ed7b" Jan 30 07:02:34 crc kubenswrapper[4841]: I0130 07:02:34.423110 4841 scope.go:117] "RemoveContainer" containerID="97ca3af24a8d8ec56f02bf5e25aa20877120e5a51f57579c1a03a25e8c18da03" Jan 30 07:02:34 crc kubenswrapper[4841]: I0130 07:02:34.463606 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" path="/var/lib/kubelet/pods/dc994cfa-341d-4e23-ac9c-72abe21d2b0c/volumes" Jan 30 07:02:36 crc kubenswrapper[4841]: I0130 07:02:36.423378 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbfe01cc-793a-4060-abff-bf3815481773" containerID="546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a" exitCode=0 Jan 30 07:02:36 crc kubenswrapper[4841]: I0130 07:02:36.423446 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szg4g" event={"ID":"cbfe01cc-793a-4060-abff-bf3815481773","Type":"ContainerDied","Data":"546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a"} Jan 30 07:02:37 crc kubenswrapper[4841]: I0130 07:02:37.435161 
4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szg4g" event={"ID":"cbfe01cc-793a-4060-abff-bf3815481773","Type":"ContainerStarted","Data":"db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07"} Jan 30 07:02:37 crc kubenswrapper[4841]: I0130 07:02:37.463181 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szg4g" podStartSLOduration=16.968680969 podStartE2EDuration="21.463160745s" podCreationTimestamp="2026-01-30 07:02:16 +0000 UTC" firstStartedPulling="2026-01-30 07:02:32.340721476 +0000 UTC m=+6889.334194154" lastFinishedPulling="2026-01-30 07:02:36.835201292 +0000 UTC m=+6893.828673930" observedRunningTime="2026-01-30 07:02:37.459601869 +0000 UTC m=+6894.453074517" watchObservedRunningTime="2026-01-30 07:02:37.463160745 +0000 UTC m=+6894.456633393" Jan 30 07:02:40 crc kubenswrapper[4841]: I0130 07:02:40.463828 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 07:02:40 crc kubenswrapper[4841]: I0130 07:02:40.464415 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 07:02:47 crc kubenswrapper[4841]: I0130 07:02:47.223597 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:47 crc kubenswrapper[4841]: I0130 07:02:47.224039 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:47 crc kubenswrapper[4841]: I0130 07:02:47.286174 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:47 crc kubenswrapper[4841]: I0130 07:02:47.618708 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:50 crc kubenswrapper[4841]: I0130 07:02:50.766504 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szg4g"] Jan 30 07:02:50 crc kubenswrapper[4841]: I0130 07:02:50.767160 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-szg4g" podUID="cbfe01cc-793a-4060-abff-bf3815481773" containerName="registry-server" containerID="cri-o://db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07" gracePeriod=2 Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.301171 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.344342 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-utilities\") pod \"cbfe01cc-793a-4060-abff-bf3815481773\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.344601 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxzsv\" (UniqueName: \"kubernetes.io/projected/cbfe01cc-793a-4060-abff-bf3815481773-kube-api-access-mxzsv\") pod \"cbfe01cc-793a-4060-abff-bf3815481773\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.344638 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-catalog-content\") pod \"cbfe01cc-793a-4060-abff-bf3815481773\" (UID: \"cbfe01cc-793a-4060-abff-bf3815481773\") " Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.346326 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-utilities" (OuterVolumeSpecName: "utilities") pod "cbfe01cc-793a-4060-abff-bf3815481773" (UID: "cbfe01cc-793a-4060-abff-bf3815481773"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.346799 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.351617 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbfe01cc-793a-4060-abff-bf3815481773-kube-api-access-mxzsv" (OuterVolumeSpecName: "kube-api-access-mxzsv") pod "cbfe01cc-793a-4060-abff-bf3815481773" (UID: "cbfe01cc-793a-4060-abff-bf3815481773"). InnerVolumeSpecName "kube-api-access-mxzsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.410226 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbfe01cc-793a-4060-abff-bf3815481773" (UID: "cbfe01cc-793a-4060-abff-bf3815481773"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.448740 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxzsv\" (UniqueName: \"kubernetes.io/projected/cbfe01cc-793a-4060-abff-bf3815481773-kube-api-access-mxzsv\") on node \"crc\" DevicePath \"\"" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.448779 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbfe01cc-793a-4060-abff-bf3815481773-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.570559 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbfe01cc-793a-4060-abff-bf3815481773" containerID="db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07" exitCode=0 Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.570612 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szg4g" event={"ID":"cbfe01cc-793a-4060-abff-bf3815481773","Type":"ContainerDied","Data":"db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07"} Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.570627 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szg4g" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.570652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szg4g" event={"ID":"cbfe01cc-793a-4060-abff-bf3815481773","Type":"ContainerDied","Data":"18b639d51c3493614dd56d08beb690271fd5b2fcf880ed05f1f72d0c89d2fa41"} Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.570702 4841 scope.go:117] "RemoveContainer" containerID="db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.601724 4841 scope.go:117] "RemoveContainer" containerID="546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.628896 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szg4g"] Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.639375 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-szg4g"] Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.642131 4841 scope.go:117] "RemoveContainer" containerID="d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.675941 4841 scope.go:117] "RemoveContainer" containerID="db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07" Jan 30 07:02:51 crc kubenswrapper[4841]: E0130 07:02:51.676299 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07\": container with ID starting with db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07 not found: ID does not exist" containerID="db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.676329 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07"} err="failed to get container status \"db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07\": rpc error: code = NotFound desc = could not find container \"db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07\": container with ID starting with db72d339d740854d95b9a0b02aee4c4005193a36ef359774081953b62ea23a07 not found: ID does not exist" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.676347 4841 scope.go:117] "RemoveContainer" containerID="546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a" Jan 30 07:02:51 crc kubenswrapper[4841]: E0130 07:02:51.676633 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a\": container with ID starting with 546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a not found: ID does not exist" containerID="546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.676669 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a"} err="failed to get container status \"546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a\": rpc error: code = NotFound desc = could not find container \"546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a\": container with ID starting with 546b1bec4c7fe316b2973b06962fe1776b06e49877228deae168e5b554da2c6a not found: ID does not exist" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.676684 4841 scope.go:117] "RemoveContainer" containerID="d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302" Jan 30 07:02:51 crc kubenswrapper[4841]: E0130 
07:02:51.677079 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302\": container with ID starting with d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302 not found: ID does not exist" containerID="d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302" Jan 30 07:02:51 crc kubenswrapper[4841]: I0130 07:02:51.677100 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302"} err="failed to get container status \"d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302\": rpc error: code = NotFound desc = could not find container \"d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302\": container with ID starting with d498e96c808120315885299be253c804328a62a6a0fcd60c291b3afa958fb302 not found: ID does not exist" Jan 30 07:02:52 crc kubenswrapper[4841]: I0130 07:02:52.452192 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbfe01cc-793a-4060-abff-bf3815481773" path="/var/lib/kubelet/pods/cbfe01cc-793a-4060-abff-bf3815481773/volumes" Jan 30 07:03:02 crc kubenswrapper[4841]: I0130 07:03:02.485301 4841 scope.go:117] "RemoveContainer" containerID="64cc1845f05ed3d4eebc200eb4b899e89b060e8e1e33e900b099d0b3d868d19b" Jan 30 07:03:10 crc kubenswrapper[4841]: I0130 07:03:10.464076 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 07:03:10 crc kubenswrapper[4841]: I0130 07:03:10.464713 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" 
podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 07:03:10 crc kubenswrapper[4841]: I0130 07:03:10.464758 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" Jan 30 07:03:10 crc kubenswrapper[4841]: I0130 07:03:10.800217 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f996c8819c194d15de61cb4cf0dae2be58222d7e2572d190b2fcc044b3e875c1"} pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 07:03:10 crc kubenswrapper[4841]: I0130 07:03:10.800350 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" containerID="cri-o://f996c8819c194d15de61cb4cf0dae2be58222d7e2572d190b2fcc044b3e875c1" gracePeriod=600 Jan 30 07:03:11 crc kubenswrapper[4841]: I0130 07:03:11.815268 4841 generic.go:334] "Generic (PLEG): container finished" podID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerID="f996c8819c194d15de61cb4cf0dae2be58222d7e2572d190b2fcc044b3e875c1" exitCode=0 Jan 30 07:03:11 crc kubenswrapper[4841]: I0130 07:03:11.815374 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerDied","Data":"f996c8819c194d15de61cb4cf0dae2be58222d7e2572d190b2fcc044b3e875c1"} Jan 30 07:03:11 crc kubenswrapper[4841]: I0130 07:03:11.815984 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" event={"ID":"a24700eb-27ff-4126-9f6a-40ee9575e5ef","Type":"ContainerStarted","Data":"340f751c32c272077dc09c95849c02f76113a63905c0deee54b89dddb7e83c99"} Jan 30 07:03:11 crc kubenswrapper[4841]: I0130 07:03:11.816042 4841 scope.go:117] "RemoveContainer" containerID="e34e0299cbe0212fb8407d9013ac31d06e78749b7db5d946badbf29c3a7ce5e2" Jan 30 07:03:54 crc kubenswrapper[4841]: I0130 07:03:54.060860 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-1557-account-create-update-jbrq7"] Jan 30 07:03:54 crc kubenswrapper[4841]: I0130 07:03:54.078597 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-96zrp"] Jan 30 07:03:54 crc kubenswrapper[4841]: I0130 07:03:54.088820 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-96zrp"] Jan 30 07:03:54 crc kubenswrapper[4841]: I0130 07:03:54.097916 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-1557-account-create-update-jbrq7"] Jan 30 07:03:54 crc kubenswrapper[4841]: I0130 07:03:54.451868 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44" path="/var/lib/kubelet/pods/4dbd7f71-0dfa-40ca-b8a4-1455a31e3f44/volumes" Jan 30 07:03:54 crc kubenswrapper[4841]: I0130 07:03:54.453679 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d248d34a-8ccf-48dd-bb30-9ad79bd380c8" path="/var/lib/kubelet/pods/d248d34a-8ccf-48dd-bb30-9ad79bd380c8/volumes" Jan 30 07:04:02 crc kubenswrapper[4841]: I0130 07:04:02.595060 4841 scope.go:117] "RemoveContainer" containerID="424f52dbd0bc876630cdb01ccdafa8905c99c6c3802758ba1afc93b9ed6e00cc" Jan 30 07:04:02 crc kubenswrapper[4841]: I0130 07:04:02.625194 4841 scope.go:117] "RemoveContainer" containerID="31f586e4fcccb4d5bc63337210744e4a8360bff4d48c0ae8692117a061c2023f" Jan 30 07:04:06 crc kubenswrapper[4841]: I0130 07:04:06.033694 4841 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-npgfr"] Jan 30 07:04:06 crc kubenswrapper[4841]: I0130 07:04:06.041235 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-npgfr"] Jan 30 07:04:06 crc kubenswrapper[4841]: I0130 07:04:06.451925 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f73848a-d9bc-4486-b7f5-f9f3fca5e13c" path="/var/lib/kubelet/pods/6f73848a-d9bc-4486-b7f5-f9f3fca5e13c/volumes" Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.037341 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nvvjp"] Jan 30 07:04:55 crc kubenswrapper[4841]: E0130 07:04:55.038479 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerName="gather" Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038495 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerName="gather" Jan 30 07:04:55 crc kubenswrapper[4841]: E0130 07:04:55.038507 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfe01cc-793a-4060-abff-bf3815481773" containerName="extract-utilities" Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038515 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfe01cc-793a-4060-abff-bf3815481773" containerName="extract-utilities" Jan 30 07:04:55 crc kubenswrapper[4841]: E0130 07:04:55.038526 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerName="copy" Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038531 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerName="copy" Jan 30 07:04:55 crc kubenswrapper[4841]: E0130 07:04:55.038543 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfe01cc-793a-4060-abff-bf3815481773" 
containerName="extract-content"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038549 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfe01cc-793a-4060-abff-bf3815481773" containerName="extract-content"
Jan 30 07:04:55 crc kubenswrapper[4841]: E0130 07:04:55.038583 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfe01cc-793a-4060-abff-bf3815481773" containerName="registry-server"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038590 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfe01cc-793a-4060-abff-bf3815481773" containerName="registry-server"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038771 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerName="copy"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038791 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc994cfa-341d-4e23-ac9c-72abe21d2b0c" containerName="gather"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.038806 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfe01cc-793a-4060-abff-bf3815481773" containerName="registry-server"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.040240 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.061052 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvvjp"]
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.185947 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-catalog-content\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.186130 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-utilities\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.186222 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcf6t\" (UniqueName: \"kubernetes.io/projected/68b31c4f-ac22-4ea6-b980-83499668860d-kube-api-access-vcf6t\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.288615 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-utilities\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.288745 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcf6t\" (UniqueName: \"kubernetes.io/projected/68b31c4f-ac22-4ea6-b980-83499668860d-kube-api-access-vcf6t\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.288918 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-catalog-content\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.289284 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-utilities\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.289622 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-catalog-content\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.314495 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcf6t\" (UniqueName: \"kubernetes.io/projected/68b31c4f-ac22-4ea6-b980-83499668860d-kube-api-access-vcf6t\") pod \"certified-operators-nvvjp\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") " pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.372622 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:04:55 crc kubenswrapper[4841]: I0130 07:04:55.936006 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvvjp"]
Jan 30 07:04:56 crc kubenswrapper[4841]: I0130 07:04:56.026087 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvjp" event={"ID":"68b31c4f-ac22-4ea6-b980-83499668860d","Type":"ContainerStarted","Data":"101f71c17d2ebe0e6b0f8e7232a49ba4282d2299767712b8a492c73a20c29e57"}
Jan 30 07:04:57 crc kubenswrapper[4841]: I0130 07:04:57.039947 4841 generic.go:334] "Generic (PLEG): container finished" podID="68b31c4f-ac22-4ea6-b980-83499668860d" containerID="b40860d78ec30bb426e488adfc326c0a416bf55740ea1e2a201e7f91d37ff82e" exitCode=0
Jan 30 07:04:57 crc kubenswrapper[4841]: I0130 07:04:57.039982 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvjp" event={"ID":"68b31c4f-ac22-4ea6-b980-83499668860d","Type":"ContainerDied","Data":"b40860d78ec30bb426e488adfc326c0a416bf55740ea1e2a201e7f91d37ff82e"}
Jan 30 07:04:58 crc kubenswrapper[4841]: I0130 07:04:58.053265 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvjp" event={"ID":"68b31c4f-ac22-4ea6-b980-83499668860d","Type":"ContainerStarted","Data":"dc3a01400759a84c7ffd9189881527da6f8e8771ac31feb4e5eb41c3175194a2"}
Jan 30 07:05:00 crc kubenswrapper[4841]: I0130 07:05:00.075616 4841 generic.go:334] "Generic (PLEG): container finished" podID="68b31c4f-ac22-4ea6-b980-83499668860d" containerID="dc3a01400759a84c7ffd9189881527da6f8e8771ac31feb4e5eb41c3175194a2" exitCode=0
Jan 30 07:05:00 crc kubenswrapper[4841]: I0130 07:05:00.075699 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvjp" event={"ID":"68b31c4f-ac22-4ea6-b980-83499668860d","Type":"ContainerDied","Data":"dc3a01400759a84c7ffd9189881527da6f8e8771ac31feb4e5eb41c3175194a2"}
Jan 30 07:05:01 crc kubenswrapper[4841]: I0130 07:05:01.086867 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvjp" event={"ID":"68b31c4f-ac22-4ea6-b980-83499668860d","Type":"ContainerStarted","Data":"5ff0d544020d0c7bed9f2b7d548eda91eaa03fd22cad7e62643f5afe79d63389"}
Jan 30 07:05:01 crc kubenswrapper[4841]: I0130 07:05:01.108365 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nvvjp" podStartSLOduration=2.639703543 podStartE2EDuration="6.108346599s" podCreationTimestamp="2026-01-30 07:04:55 +0000 UTC" firstStartedPulling="2026-01-30 07:04:57.041606437 +0000 UTC m=+7034.035079085" lastFinishedPulling="2026-01-30 07:05:00.510249503 +0000 UTC m=+7037.503722141" observedRunningTime="2026-01-30 07:05:01.106730116 +0000 UTC m=+7038.100202774" watchObservedRunningTime="2026-01-30 07:05:01.108346599 +0000 UTC m=+7038.101819237"
Jan 30 07:05:02 crc kubenswrapper[4841]: I0130 07:05:02.775810 4841 scope.go:117] "RemoveContainer" containerID="7d3802b1eec8b13571d4cc3babe84e4f32c139308006e2f7141bbeb194e5e5ec"
Jan 30 07:05:05 crc kubenswrapper[4841]: I0130 07:05:05.374247 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:05:05 crc kubenswrapper[4841]: I0130 07:05:05.375095 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:05:05 crc kubenswrapper[4841]: I0130 07:05:05.439599 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:05:06 crc kubenswrapper[4841]: I0130 07:05:06.188588 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:05:06 crc kubenswrapper[4841]: I0130 07:05:06.256638 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvvjp"]
Jan 30 07:05:08 crc kubenswrapper[4841]: I0130 07:05:08.155997 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nvvjp" podUID="68b31c4f-ac22-4ea6-b980-83499668860d" containerName="registry-server" containerID="cri-o://5ff0d544020d0c7bed9f2b7d548eda91eaa03fd22cad7e62643f5afe79d63389" gracePeriod=2
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.188048 4841 generic.go:334] "Generic (PLEG): container finished" podID="68b31c4f-ac22-4ea6-b980-83499668860d" containerID="5ff0d544020d0c7bed9f2b7d548eda91eaa03fd22cad7e62643f5afe79d63389" exitCode=0
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.188124 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvjp" event={"ID":"68b31c4f-ac22-4ea6-b980-83499668860d","Type":"ContainerDied","Data":"5ff0d544020d0c7bed9f2b7d548eda91eaa03fd22cad7e62643f5afe79d63389"}
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.562704 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.627300 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-utilities\") pod \"68b31c4f-ac22-4ea6-b980-83499668860d\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") "
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.627572 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcf6t\" (UniqueName: \"kubernetes.io/projected/68b31c4f-ac22-4ea6-b980-83499668860d-kube-api-access-vcf6t\") pod \"68b31c4f-ac22-4ea6-b980-83499668860d\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") "
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.627636 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-catalog-content\") pod \"68b31c4f-ac22-4ea6-b980-83499668860d\" (UID: \"68b31c4f-ac22-4ea6-b980-83499668860d\") "
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.628344 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-utilities" (OuterVolumeSpecName: "utilities") pod "68b31c4f-ac22-4ea6-b980-83499668860d" (UID: "68b31c4f-ac22-4ea6-b980-83499668860d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.629066 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.648438 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b31c4f-ac22-4ea6-b980-83499668860d-kube-api-access-vcf6t" (OuterVolumeSpecName: "kube-api-access-vcf6t") pod "68b31c4f-ac22-4ea6-b980-83499668860d" (UID: "68b31c4f-ac22-4ea6-b980-83499668860d"). InnerVolumeSpecName "kube-api-access-vcf6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.701019 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68b31c4f-ac22-4ea6-b980-83499668860d" (UID: "68b31c4f-ac22-4ea6-b980-83499668860d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.731281 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcf6t\" (UniqueName: \"kubernetes.io/projected/68b31c4f-ac22-4ea6-b980-83499668860d-kube-api-access-vcf6t\") on node \"crc\" DevicePath \"\""
Jan 30 07:05:09 crc kubenswrapper[4841]: I0130 07:05:09.731604 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b31c4f-ac22-4ea6-b980-83499668860d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.208196 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvjp" event={"ID":"68b31c4f-ac22-4ea6-b980-83499668860d","Type":"ContainerDied","Data":"101f71c17d2ebe0e6b0f8e7232a49ba4282d2299767712b8a492c73a20c29e57"}
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.208250 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvvjp"
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.208299 4841 scope.go:117] "RemoveContainer" containerID="5ff0d544020d0c7bed9f2b7d548eda91eaa03fd22cad7e62643f5afe79d63389"
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.251025 4841 scope.go:117] "RemoveContainer" containerID="dc3a01400759a84c7ffd9189881527da6f8e8771ac31feb4e5eb41c3175194a2"
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.260225 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvvjp"]
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.275027 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nvvjp"]
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.298090 4841 scope.go:117] "RemoveContainer" containerID="b40860d78ec30bb426e488adfc326c0a416bf55740ea1e2a201e7f91d37ff82e"
Jan 30 07:05:10 crc kubenswrapper[4841]: I0130 07:05:10.451861 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b31c4f-ac22-4ea6-b980-83499668860d" path="/var/lib/kubelet/pods/68b31c4f-ac22-4ea6-b980-83499668860d/volumes"
Jan 30 07:05:40 crc kubenswrapper[4841]: I0130 07:05:40.463615 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 07:05:40 crc kubenswrapper[4841]: I0130 07:05:40.464213 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 07:06:10 crc kubenswrapper[4841]: I0130 07:06:10.463503 4841 patch_prober.go:28] interesting pod/machine-config-daemon-hd8v2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 07:06:10 crc kubenswrapper[4841]: I0130 07:06:10.464105 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hd8v2" podUID="a24700eb-27ff-4126-9f6a-40ee9575e5ef" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"